Case Studies Related to Legal and Ethical Issues in the Use of ICT


Find case studies illustrating dilemmas in technology and ethics.

Case studies are also available on Internet Ethics.

For permission to reprint cases, submit requests to [email protected].

Looking to draft your own case studies? This template provides the basics for writing ethics case studies in technology (though with some modification it could be used in other fields as well).

Ethical questions arise in interactions among students, instructors, administrators, and providers of AI tools.

Which stakeholders might benefit from a new age of VR “travel”? Which stakeholders might be harmed?

Case study on the history-making GameStop short squeeze and stock price surge that occurred during January 2021.

As PunkSpider awaits re-release, ethical questions surround a tool that can spot vulnerabilities across the web and share those results with the public.

With URVR, recipients can capture and share 360° 3D moments and relive them together.

VR rage rooms may provide therapeutic and inexpensive benefits while also raising ethical questions.

A VR dating app intended to help ease the stress and awkwardness of early dating in a safe and comfortable way.

Ethical questions pertaining to the use of proctoring software in online teaching.

Concern is high over possible misuses of new AI technology which generates text without human intervention.

As cities collect, analyze, and share more data, new ethical questions arise.


Social and Ethical Implications of ICT Use


Track Chairs

Christy M.K. Cheung, Associate Professor, Hong Kong Baptist University, China. Email: [email protected]

Ofir Turel, Professor, California State University, Fullerton, USA. Email: [email protected]

Helena Wenninger, Lecturer, Lancaster University, UK. Email: [email protected]

Track Description

Recent years have witnessed a mounting integration of information and communication technology (ICT) into all areas of our lives, transforming the way we work, study, share, play, socialize, and live together as a society. Despite the many personal, educational, and work benefits offered by ICT, its use raises a variety of social and ethical concerns (technology addiction, cyberbullying, eroded personal relationships, influenced elections, online fraud, Internet vigilantism, invasion of privacy, and infringement of intellectual property rights, to mention just a few). While an extensive body of research has emphasized the “bright side” or positive impact of ICT use, nascent academic research is balancing this view. Thus, research on the “dark side” or the undesirable social and ethical consequences associated with the use of ICT for individuals, organizations, and societies is receiving more attention in the light of recent developments (e.g., Ransbotham, Fichman, Gopal, & Gupta, 2016; Majchrzak, Markus, & Wareham, 2016).

The objective of this track is to develop theoretical insight into, and a practical understanding of, topics and issues that concern the potential social and ethical implications of ICT use, with a focus on the various unfavourable aspects associated with ICT use. We especially welcome papers that identify and address relevant knowledge gaps in: (1) the nature of the problem under investigation (i.e., ICT use and its associated social/ethical implications), (2) aspects associated with the problem, and (3) potential IT and/or non-IT solutions that can mitigate the problem. Other topics that touch on social and ethical implications of ICT use are equally welcome.

The track is open to all methodological approaches. We invite both full research and research-in-progress papers.

Topics of interest include, but are not limited to:

Societal impact of current or emerging technologies or technological trends, e.g., Internet of Things, Internet of People, artificial intelligence, augmented reality, social/mobile computing, etc.

Unethical uses of ICTs in elections, organisations, marketing, etc.

Cyberbullying, online harassment, online trolling, and Internet judges

Work stress, overload, addiction, financial victimization, and illegitimate surveillance

Reputation and credibility issues in ICT-based applications

Responsible ICTs innovation

ICT-related unemployment and deskilling

Employee responsibility and autonomy in organizational use of ICTs

The role of ICT in social inclusion/exclusion and educational (in)equality

Strategies and interventions (e.g., IT design, IT use practices, IT management policies, and governance mechanisms) for addressing the societal consequences of ICT use

Incorporating societal concerns in ICT planning and governance

Implications of a digital society where sharing data, information and knowledge is common for governments, businesses and people

Publishing Opportunities in Leading Journals

High-quality and relevant papers from this track will be selected for fast-tracked development towards Internet Research (http://www.emeraldinsight.com/loi/intr). Selected papers will need to expand in content and length in line with the requirements for standard research articles published in the journal.

Although the track co-chairs are committed to guiding the selected papers towards final publication, further reviews may be needed before a final publication decision can be made.

Internet Research (IntR) is an international, refereed journal that is indexed and abstracted in major databases (e.g., SSCI, SCI, ABI/INFORM Global). The topics published in IntR are broad and interdisciplinary in nature. The impact factor (2016) and the 5-year impact factor (2016) of the journal are 2.931 and 4.580, respectively.

Track Associate Editors

1. Carmen Leong, Lecturer, University of New South Wales, Australia.

2. Jie Yu, Assistant Professor, The University of Nottingham Ningbo China.

3. Irina Heimbach, Assistant Professor, WHU Otto Beisheim School of Management, Germany.

4. Amjad Fayoumi, Lecturer, Lancaster University, UK.

5. Ben Choi, Assistant Professor, Nanyang Technological University, Singapore.

6. Sebastian Schuetz, Assistant Professor, University of Arkansas, USA.

7. Lauri Wessel, Assistant Professor, University of Bremen, Germany.

8. Mengxiang Li, Assistant Professor, Hong Kong Baptist University, China.

9. Ruba Aljafari, Assistant Professor, University of Pittsburgh, USA.

10. Sangseok You, Assistant Professor, HEC Paris, France.

11. Jens Foerderer, Assistant Professor, University of Mannheim, Germany.

12. Yong Liu, Assistant Professor, Aalto University, Finland.

13. Tommy Chan, Lecturer, Northumbria University, UK.

14. Brian Lee, Assistant Professor, University of Massachusetts Lowell, USA.

15. Youngseok Choi, Senior Lecturer (Associate Professor), Brunel University London, UK.

16. Dimitra Skoumpopoulou, Senior Lecturer, Northumbria University, UK.

17. Zach W. Y. Lee, Assistant Professor, Durham University, UK.

Last updated: September 19, 2018 Source: ECIS2019

Responsibility in application of ICT as legal, moral and ethical issues



ISSUES AND CHALLENGES IN THE USE OF INFORMATION COMMUNICATION TECHNOLOGY (ICTs) IN EDUCATION

Esoswo Francisca Ogbomo

ICT has given rise to a host of legal and ethical issues and challenges in the use of ICT for education. Pre-service and in-service teachers, as well as students, need to know to a reasonable extent about the issues and challenges in the use of ICT for education. As teachers or potential teachers and students, they need to be above reproach. Teachers and students should understand the basic issues (effectiveness, cost, equity, and sustainability), as well as the challenges (infrastructure-related challenges, capacity-building challenges, and challenges related to financing the cost of ICT use, to mention but a few) surrounding the use of ICT in education, and then apply those issues as principles in practice.


22 Legal and ethical issues of using ICT

Dr. Geeta R. Thakur

Module structure

21.0 Learning Outcomes

21.1 Introduction

21.2 Copyright

21.3 Steps to protect copyright

21.4 Copyright infringement/violation

21.5 Plagiarism

21.6 Prevention from plagiarism

21.7 Hacking

21.8 Prevention from hacking

21.9 Let us sum up

21.0 LEARNING OUTCOMES

After going through this module, you will be able to:

  • State the meaning of copyright, hacking, and plagiarism.
  • Explain how to avoid copyright violation and how to protect copyright.
  • Explain how to prevent being hacked.
  • Explain how to avoid indulging in plagiarism.

21.1 INTRODUCTION

With so many people using computers today, and many of them connected to the internet, many problems arise. Information is easily accessible, so it has become easy to download or copy it. This leads to problems like plagiarism and copyright violation. Many users worry that others will misuse their computers and might steal their data to commit fraud. These are some of the legal and ethical issues related to using ICT, and teachers need to have a reasonable amount of information about them.

21.2 COPYRIGHT

  • Copyright means that the owner has complete control over what can be done with his/her original intellectual work, and to what extent. The original work is protected under copyright and thus cannot be stolen or copied without the permission of the owner.
  • “Copyright is a legal device that provides the creator of a work of art or literature, or any work that conveys information or ideas, the right to control how the work is used” (Fishman, 2008).
  • The intent of copyright is to advance the progress of knowledge by giving the author of a work an economic incentive to create new works (Loren, 2000).
  • Copyright provides the owner an exclusive right to reproduce the original work, prepare derivative works from it, and distribute it publicly. For example, if somebody wants to develop a play from your short story, you have the right to license your work, permit others to use it, and get paid for it.
  • Copyright protects expression in some tangible form, e.g., literary works such as novels and stories, dramatic works such as plays, music, etc., but it does not protect an idea or the title of a work.
  • The original work is often displayed with the © symbol to show that it is copyrighted. Example: © T. Genne 2007.

Obtaining Copyright:

You have copyright protection as soon as you fix the expression in a tangible format.

Copyright is automatic and requires no paperwork.

Works that can be copyrighted:

Tangible, original expressions can be copyrighted, whereas a verbal presentation that is not recorded or written down cannot. According to the United States Copyright Office (2008), there are three fundamental requirements for something to be copyrighted:

  • Fixation: The item must be fixed in some way, for example by writing it on a piece of paper, posting it online, or storing it on a computer, phone, or audio or video device.
  • Originality: The work must be original. Anything from a novel to a student’s email message to a professor can qualify; both are examples of original expression.
  • Minimal Creativity: The work must include something that goes above and beyond what it borrows from existing work. A reference to an existing work that is used to discuss a new concept would be considered original.

Protection provided by copyright:

Copyright provides authors fairly substantial control over their work and protects it from being misused. The four basic protections are the right:

  • to create copies of the original work.
  • to sell or distribute copies of the work.
  • to prepare new works based on the protected work.
  • to perform or display the protected work publicly.

Limitations of Copyright:

The following types of work cannot be copyrighted:

  • Works in the public domain, which include:
      • Ideas and facts, as they are intangible.
      • Words, names, slogans, and other short phrases.
  • Government works, which include:
      • Judicial opinions.
      • Public ordinances.
      • Administrative rulings.
      • Works created by federal government employees as part of their official responsibilities.

Despite providing broad protection to original work, copyright has some limitations. The most significant limitation on the copyright holder’s rights is fair use, for which there are no set guidelines. Under these exceptions and limitations to copyright law, people are allowed to use copyrighted material for purposes such as criticism, comment, research, news reporting, and teaching.

21.3 STEPS TO PROTECT COPYRIGHT

1. Ensure your work is properly marked:

Although a copyright notice is not required, displaying one shows that you are aware of copyright and take violations of your work seriously.

2. Register your work:

You should register your work, as registration provides verifiable proof in case of copyright violation.

3. Keep or register supporting evidence:

Supporting evidence may help prove that you are the original creator of the material. It falls into two categories:

  • Evolution of ideas: Evidence of the progression of the work, which can be in the form of early drafts, synopses, rough recordings, sketches, etc.
  • Footprints or watermarking: Evidence normally inserted into finished documents.

21.4 COPYRIGHT INFRINGEMENT/VIOLATION

Copyright infringement occurs when someone copies a copyrighted work without permission and either passes it off as his or her own, or uses substantial portions of it without permission and outside fair use. To prove copyright violation, you need to show that you own the work and that the work is entitled to copyright protection, meaning that it has the requisite level of originality. If you register and obtain your certificate of copyright within five years of creating the work, that is evidence of the validity of the copyright. The second thing you need to prove is that your work has been copied. When a work becomes available for use without permission from a copyright owner, it is said to be “in the public domain”; this happens when its copyright has expired.

To learn how copyright issues may arise in the classroom, see the following links:

  • http://www.knowyourcopyrights.org/bm~doc/kycrfaq.pdf
  • http://www.slideshare.net/WCU_Becca/copyright-infringement-8018424

21.5 PLAGIARISM

Plagiarism occurs when a person submits a project as his or her own creation when it was in fact prepared by copying somebody else’s work. Let us discuss the meaning of plagiarism.

CONCEPT OF PLAGIARISM

According to the Merriam-Webster Online Dictionary, to “plagiarize” means:

  • To steal another’s original work and pass it off as one’s own
  • To use someone else’s production without crediting the source
  • To commit literary theft, i.e., to use the contents of an original work without citing the source
  • To present an idea derived from an existing source as new and original

In simpler terms, plagiarism involves both stealing someone else’s work and lying about it afterward. It is the use of another’s original words and ideas as though they were your own.

All of the following are considered plagiarism:

  • Copying someone else’s work and presenting it as your own.
  • Failing to put quotation marks around quoted material.
  • Giving incorrect information about the source of a quotation.
  • Changing the words but copying the sentence structure of a source without giving credit.
  • Copying so many words and ideas from a source that they make up the majority of your work, whether you give credit or not.

Consequences of plagiarism:

The following are the consequences of plagiarism:

  • Failing the assignment or getting poor or lower grades
  • Failing the class or detention
  • Expulsion or rustication from school
  • Termination from the workplace
  • Court appearances and fines, or both, in some cases
  • Embarrassment and humiliation faced due to the above-mentioned charges

Detecting plagiarism:

  • Identify distinctive phrases (2-3 words) in students’ papers. Search for them using a search engine such as Google to detect any kind of theft from the original work (one way to automate this is sketched below).
  • Search for a relevant subject using a web search tool, well-known ‘page mill’ sites under various topics, online databases (EBSCO, ProQuest), and CD-ROM reference tools. Once you find a suspect source, use your browser’s ‘find’ tool to locate distinctive phrases from student papers.
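The first tip can be partially automated. Below is a minimal sketch in Python of pulling candidate distinctive phrases out of a paper and printing them as quoted search queries; the phrase length, stopword list, and filtering rule are illustrative assumptions, not something the module prescribes.

```python
# Minimal sketch: extract candidate "distinctive phrases" from a student
# paper and print them as quoted search-engine queries for manual checking.
# The stopword list and the "longish uncommon word" filter are assumptions.
import re

STOPWORDS = {"the", "and", "that", "with", "this", "from", "have", "were", "are"}

def distinctive_phrases(text, phrase_len=3, top_n=5):
    """Return short word sequences unusual enough to be worth searching."""
    words = re.findall(r"[a-z']+", text.lower())
    phrases = []
    for i in range(len(words) - phrase_len + 1):
        window = words[i:i + phrase_len]
        # keep phrases containing at least one uncommon, longish word
        if any(w not in STOPWORDS and len(w) > 6 for w in window):
            phrases.append(" ".join(window))
    return phrases[:top_n]

paper_text = "The epistemic topology of networked discourse reconfigures authorship."
for phrase in distinctive_phrases(paper_text):
    print(f'"{phrase}"')  # paste each quoted query into a search engine
```

Quoted queries force most search engines to match the exact phrase, which is what makes short distinctive strings effective for spotting copied text.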

21.6 PREVENTION FROM PLAGIARISM

  • Plan your paper: If you know you are going to use other sources of information, plan how you are going to use them. Balance your own ideas with the sources that support them.
  • Take notes: Take notes from all your sources so that you have much of the information organized before you begin writing. Use different colored fonts, pens, or pencils for each source, and make sure that you clearly distinguish your own ideas from those you found elsewhere. Also, get in the habit of recording page numbers, and make sure that you record bibliographic information or web addresses for every source.
  • Make it clear who said what: Even if you cite sources, make sure that when you mix your own ideas with those of your sources, you always clearly distinguish them.
  • Know how to paraphrase: A paraphrase is a restatement in your own words. Changing a few words of the original sentences does NOT make your writing a legitimate paraphrase. You must change both the words and the sentence structure of the original, without changing the content. Paraphrased passages still require citation, because the ideas have come from another source even if you are putting them in your own words.
  • Cite references and use footnotes: A “citation” is the way you tell your readers that certain material in your work has come from another source. It also gives your readers the information necessary to find that source again. Giving credit to the original author by citing sources is the only way to use other people’s work without plagiarizing.

21.7 HACKING

Internet security is one of the major fears of computer users. They mainly fear the exposure of their private documents and information, and hackers who break into their security systems to obtain information through unethical acts. Hacking is illegally accessing someone else’s computer without permission, regardless of the activity or intent. What can hackers do?

  • Invade your privacy
  • Delete information
  • Damage files
  • Impersonate you on your computer
  • Decrease the owner’s ability to earn income from their work

21.8 PREVENTION FROM HACKING

  • Tough passwords: You need a separate password for each account, so that even if one account gets hacked, the rest of your important information is not exposed. The problem is that it is tough to remember dozens of passwords. To help with this, there are a variety of third-party software programs that will create and store passwords for you (see the sketch after this list for a minimal example of generating one).
  • Authentication: Many email providers offer a “two-factor” authentication option in your settings. When you sign on with your password, a message is sent to your phone that prompts you to enter an additional access code.
  • Change your behavior: Oversharing may cause harm. Details like birth dates and graduation years can be used to access your information, so avoid sharing too many details about your life.
  • Keep backups: Use an external hard drive or an online service. Being hacked can be the gateway to identity theft or worse.
  • Keep your email secure: Your email is the centre of your online life, so you should keep it the most secure. If your email is hacked, hackers may gain access to your bank accounts as well.
  • Change passwords frequently and use lengthy passwords: Your password should be changed every month or two and made difficult to guess. The length of your password is more important than its complexity: longer passwords mean more work for hacking software, and hackers generally want quick results.
  • Update your system: Programs like Adobe Acrobat Reader, Microsoft Office, and Java are heavily abused by hackers, so keep these programs up to date and uninstall software you no longer use.
  • Use antivirus software: Use antivirus programs, as many of them provide protection from spyware, malware, and viruses.
  • Stick to secured sites: Web addresses that begin with “http” use the basic Hypertext Transfer Protocol. With “https,” the “s” stands for “secure”: it authenticates the website and the web server you are communicating with.

  • Be email cautious: We often receive emails without really knowing who the source is, so never open an email, and especially an attachment, from an unknown source. Infections can come from already-hacked friends, too. For example, a hacker sends an infected message to everyone in the victim’s online address book; open its attachment and you unwittingly become an infection spreader as well. Be suspicious if a friend appears to have sent you an email with no subject line, a subject line that only says “RE” or “FW,” or one that is uncharacteristically vague or brief, especially if the email text contains an Internet link.
  • Be careful what you click: Avoid clicking links that promise free prizes or gifts, and be cautious of third-party security alerts. If you are browsing the Internet and a website’s pop-up tells you that you have viruses, it could be a trap to get you to download harmful files. Some hackers hire call centers overseas; claiming to be from Microsoft or another vendor, they may say, “We have detected a virus on your machine; go to this website, download and run this program so we can fix it for you.” This gets them inside your machine.
  • Be cautious of software downloads: If you are getting software at a discount or for free online, beware: there is a lot of pirated software out there, and it may carry some sort of malware. When you are ordering any kind of software for any device, buy it through conventional channels, such as a manufacturer’s website, and not through links.
  • Be cautious about USB flash drives: You may get one as a gift, but it could have other software stowed away. Once on your computer, it may gain access to all your files or infect your information. Any peripheral connected to a computer can infect it.
  • Be alert about apps: Be careful what you put on your phone. If you go to a website you do not know, what you are buying for 99 cents could be designed by a hacker anywhere in the world. Always use reputable apps, and select them cautiously.

The number of platforms (like Windows, Apple, and Android) with app stores is increasing, but there are also some bad alternative app stores for Android out there. Users should stick with the official store for their platform.
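Two of the tips above are easy to illustrate in code: generating a long random password (see “Tough passwords”) and preferring HTTPS addresses (see “Stick to secured sites”). The sketch below uses only Python’s standard library; the password length and alphabet are illustrative assumptions rather than recommendations from this module.

```python
# Minimal sketch: generate a long random password and check that a URL
# uses HTTPS before trusting it. Length and alphabet are assumptions.
import secrets
import string
from urllib.parse import urlparse

def make_password(length=20):
    """Long random password; length matters more than exotic complexity."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def uses_https(url):
    """True only for https:// URLs, which authenticate the web server."""
    return urlparse(url).scheme == "https"

print(make_password())                    # different on every run
print(uses_https("http://example.com"))   # False: plain HTTP
print(uses_https("https://example.com"))  # True: TLS-protected
```

The `secrets` module is designed for security-sensitive randomness, unlike `random`, whose output is predictable and should not be used for passwords.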

21.9 LET US SUM UP

Copyright is one of the important legal issues of this technology era. Copyright protects the rights of the creator of a work that is fixed in some tangible form, minimally creative, and original.

Copyright gives the right to make copies of the work, to sell or distribute it, to prepare new works based on it, and to perform it in public.

Copyright can be protected by properly fixing your work in tangible form, marking it with a copyright notice, registering it, and keeping supporting proof of your work.

Plagiarism is an important ethical issue. Plagiarism is stealing somebody’s work and failing to give credit to the original creator. Everything from outright copying of entire material to paraphrasing somebody’s work without credit counts as plagiarism. Planning the writing work, writing in your own language, and giving references wherever necessary are some of the ways to prevent plagiarism.

Hacking is both an ethical and a legal issue. Hacking is illegally accessing someone else’s computer without permission. To prevent hacking, one should keep difficult and lengthy passwords, use authentication for email, keep backups, keep email safe, be cautious about apps, unsecured websites, email, and USB drives, and keep the system updated.

“It seemed that when people entered the computer center they left their ethics at the door.” (Donn Parker, “Rules of Ethics in Information Processing,” Communications of the ACM, 1968)

It is essential to inculcate ethics and make people aware of issues like copyright and hacking.

REFERENCES:

  • Lathrop, Ann & Kathleen Foss. Student Cheating and Plagiarism in the Internet Era. Englewood, CO: Libraries Unlimited, 2000.
  • Bowman, Vibiana. The Plagiarism Plague. New York: Neal-Schuman, 2004.
  • http://www.iage.com/PAplagshort.ppt
  • http://www.newsobserver.com/2014/03/02/3660568/10-easy-ways-to-protect-yourself.html#storylink=cpy
  • Radcliff, Deborah, Jan. 1999. Internet Security News: [ISN] Hackers for Hire. [Online] Available at: http://www.landfield.com/isn/mail-archive/1999/Jan/0053.html
  • Wikipedia, the Free Encyclopedia, March 2004. Hacker. [Online] Available at: http://en.wikipedia.org/wiki/Hacker
  • Riley, James, 2001. Industry looks to get hacked to bits. [Online] Available at: http://www.consensus.com.au/ITWritersAwards/ITWarchive/ITWentries01/itw01f-jr-ih36.htm
  • Kapica, Jack, March 2004. Globetechnology: The syntax of Viruses. [Online] Available at: http://www.globetechnology.com/servlet/story/RTGAM.20040304.gtkapicamar4/BNStory/Technology/
  • Internet and Network Security, 2004. Introduction to Intrusion Detection Systems (IDS). [Online] Available at: http://netsecurity.about.com/cs/hackertools/a/aa030504_2.htm
  • Internet and Network Security, 2004. Hacker tools – Utilities used by hackers, crackers & phreaks. [Online] Available at: http://netsecurity.about.com/cs/hackertools/
  • Panda Software, 2004. Panda Software – About. [Online] Available at: http://us.pandasoftware.com/about/press/viewNews.aspx?noticia=4842
  • Platform Logic, 2004. SoBigF: Intrusion Prevention. [Online] Available at: http://www.platformlogic.com/solutions/mydoom.asp
  • http://www.umuc.edu/library/libhow/copyright.cfm

McCombs School of Business


Case Studies

More than 70 cases pair ethics concepts with real world situations. From journalism, performing arts, and scientific research to sports, law, and business, these case studies explore current and historic ethical dilemmas, their motivating biases, and their consequences. Each case includes discussion questions, related videos, and a bibliography.

A Million Little Pieces

James Frey’s popular memoir stirred controversy and media attention after it was revealed to contain numerous exaggerations and fabrications.

Abramoff: Lobbying Congress

Super-lobbyist Abramoff was caught in a scheme to lobby against his own clients. Was a corrupt individual or a corrupt system – or both – to blame?

Apple Suppliers & Labor Practices

Is tech company Apple, Inc. ethically obligated to oversee the questionable working conditions of other companies further down their supply chain?

Approaching the Presidency: Roosevelt & Taft

Some presidents view their responsibilities in strictly legal terms, others according to duty. Roosevelt and Taft took two extreme approaches.

Appropriating “Hope”

Fairey’s portrait of Barack Obama raised debate over the extent to which an artist can use and modify another’s artistic work, yet still call it one’s own.

Arctic Offshore Drilling

Competing groups frame the debate over oil drilling off Alaska’s coast in varying ways depending on their environmental and economic interests.

Banning Burkas: Freedom or Discrimination?

The French law banning women from wearing burkas in public sparked debate about discrimination and freedom of religion.

Birthing Vaccine Skepticism

Wakefield published an article riddled with inaccuracies and conflicts of interest that created significant vaccine hesitancy regarding the MMR vaccine.

Blurred Lines of Copyright

Marvin Gaye’s Estate won a lawsuit against Robin Thicke and Pharrell Williams for the hit song “Blurred Lines,” which had a similar feel to one of his songs.

Bullfighting: Art or Not?

Bullfighting has been a prominent cultural and artistic event for centuries, but in recent decades it has faced increasing criticism for animal rights’ abuse.

Buying Green: Consumer Behavior

Does purchasing green products, such as organic foods and electric cars, give consumers a moral license to indulge in unethical behavior?

Cadavers in Car Safety Research

Engineers at Heidelberg University insist that the use of human cadavers in car safety research is ethical because their research can save lives.

Cardinals’ Computer Hacking

St. Louis Cardinals scouting director Chris Correa hacked into the Houston Astros’ webmail system, leading to legal repercussions and a lifetime ban from MLB.

Cheating: Atlanta’s School Scandal

Teachers and administrators at Parks Middle School adjust struggling students’ test scores in an effort to save their school from closure.

Cheating: Sign-Stealing in MLB

The Houston Astros’ sign-stealing scheme rocked the baseball world, leading to a game-changing MLB investigation and fallout.

Cheating: UNC’s Academic Fraud

UNC’s academic fraud scandal uncovered an 18-year scheme of unchecked coursework and fraudulent classes that enabled student-athletes to play sports.

Cheney v. U.S. District Court

A controversial case focuses on Justice Scalia’s personal friendship with Vice President Cheney and the possible conflict of interest it poses to the case.

Christina Fallin: “Appropriate Culturation?”

After Fallin posted a picture of herself wearing a Plain’s headdress on social media, uproar emerged over cultural appropriation and Fallin’s intentions.

Climate Change & the Paris Deal

While climate change poses many abstract problems, the actions (or inactions) of today’s populations will have tangible effects on future generations.

Cover-Up on Campus

While the Baylor University football team was winning on the field, university officials failed to take action when allegations of sexual assault by student athletes emerged.

Covering Female Athletes

Sports Illustrated stirs controversy when their cover photo of an Olympic skier seems to focus more on her physical appearance than her athletic abilities.

Covering Yourself? Journalists and the Bowl Championship

Can news outlets covering the Bowl Championship Series fairly report sports news if their own polls were used to create the news?

Cyber Harassment

After a student defames a middle school teacher on social media, the teacher confronts the student in class and posts a video of the confrontation online.

Defending Freedom of Tweets?

Running back Rashard Mendenhall receives backlash from fans after criticizing the celebration of the assassination of Osama Bin Laden in a tweet.

Dennis Kozlowski: Living Large

Dennis Kozlowski was an effective leader for Tyco in his first few years as CEO, but eventually faced criminal charges over his use of company assets.

Digital Downloads

File-sharing program Napster sparked debate over the legal and ethical dimensions of downloading unauthorized copies of copyrighted music.

Dr. V’s Magical Putter

Journalist Caleb Hannan outed Dr. V as a trans woman, sparking debate over the ethics of Hannan’s reporting, as well as its role in Dr. V’s suicide.

East Germany’s Doping Machine

From 1968 to the late 1980s, East Germany (GDR) doped some 9,000 athletes to gain success in international athletic competitions despite being aware of the unfortunate side effects.

Ebola & American Intervention

Did the dispatch of U.S. military units to Liberia to aid in humanitarian relief during the Ebola epidemic help or hinder the process?

Edward Snowden: Traitor or Hero?

Was Edward Snowden’s release of confidential government documents ethically justifiable?

Ethical Pitfalls in Action

Why do good people do bad things? Behavioral ethics is the science of moral decision-making, which explores why and how people make the ethical (and unethical) decisions that they do.

Ethical Use of Home DNA Testing

The rising popularity of at-home DNA testing kits raises questions about privacy and consumer rights.

Flying the Confederate Flag

A heated debate ensues over whether or not the Confederate flag should be removed from the South Carolina State House grounds.

Freedom of Speech on Campus

In the wake of racially motivated offenses, student protests sparked debate over the roles of free speech, deliberation, and tolerance on campus.

Freedom vs. Duty in Clinical Social Work

What should social workers do when their personal values come in conflict with the clients they are meant to serve?

Full Disclosure: Manipulating Donors

When an intern witnesses a donor making a large gift to a non-profit organization under misleading circumstances, she struggles with what to do.

Gaming the System: The VA Scandal

The Veterans Administration’s incentives were meant to spur more efficient and productive healthcare, but not all administrators complied as intended.

German Police Battalion 101

During the Holocaust, ordinary Germans became willing killers even though they could have opted out from murdering their Jewish neighbors.

Head Injuries & American Football

Many studies have linked traumatic brain injuries and related conditions to American football, creating controversy around the safety of the sport.

Head Injuries & the NFL

American football is a rough and dangerous game and its impact on the players’ brain health has sparked a hotly contested debate.

Healthcare Obligations: Personal vs. Institutional

A medical doctor must make a difficult decision when informing patients of the effectiveness of flu shots while upholding institutional recommendations.

High Stakes Testing

In the wake of the No Child Left Behind Act, parents, teachers, and school administrators take different positions on how to assess student achievement.

In-FUR-mercials: Advertising & Adoption

When the Lied Animal Shelter faces a spike in animal intake, an advertising agency uses its moral imagination to increase pet adoptions.

Krogh & the Watergate Scandal

Egil Krogh was a young lawyer working for the Nixon Administration whose ethics faded from view when asked to play a part in the Watergate break-in.

Limbaugh on Drug Addiction

Radio talk show host Rush Limbaugh argued that drug abuse was a choice, not a disease. He later became addicted to painkillers.

LochteGate

U.S. Olympic swimmer Ryan Lochte’s “over-exaggeration” of an incident at the 2016 Rio Olympics led to very real consequences.

Meet Me at Starbucks

Two black men were arrested after an employee called the police on them, prompting Starbucks to implement “racial-bias” training across all its stores.

Myanmar Amber

Buying amber could potentially fund an ethnic civil war, but refraining allows collectors to acquire important specimens that could be used for research.

Negotiating Bankruptcy

Bankruptcy lawyer Gellene successfully represented a mining company during a major reorganization, but failed to disclose potential conflicts of interest.

Pao & Gender Bias

Ellen Pao stirred debate in the venture capital and tech industries when she filed a lawsuit against her employer on grounds of gender discrimination.

Pardoning Nixon

One month after Richard Nixon resigned from the presidency, Gerald Ford made the controversial decision to issue Nixon a full pardon.

Patient Autonomy & Informed Consent

Nursing staff and family members struggle with informed consent when taking care of a patient who has been deemed legally incompetent.

Prenatal Diagnosis & Parental Choice

Debate has emerged over the ethics of prenatal diagnosis and reproductive freedom in instances where testing has revealed genetic abnormalities.

Reporting on Robin Williams

After Robin Williams took his own life, news media covered the story in great detail, leading many to argue that such reporting violated the family’s privacy.

Responding to Child Migration

An influx of children migrants posed logistical and ethical dilemmas for U.S. authorities while intensifying ongoing debate about immigration.

Retracting Research: The Case of Chandok v. Klessig

A researcher makes the difficult decision to retract a published, peer-reviewed article after the original research results cannot be reproduced.

Sacking Social Media in College Sports

In the wake of questionable social media use by college athletes, the head coach at University of South Carolina bans his players from using Twitter.

Selling Enron

Following the deregulation of electricity markets in California, private energy company Enron profited greatly, but at a dire cost.

Snyder v. Phelps

Freedom of speech was put on trial in a case involving the Westboro Baptist Church and their protesting at the funeral of U.S. Marine Matthew Snyder.

Something Fishy at the Paralympics

Rampant cheating has plagued the Paralympics over the years, compromising the credibility and sportsmanship of Paralympian athletes.

Sports Blogs: The Wild West of Sports Journalism?

Deadspin pays an anonymous source for information related to NFL star Brett Favre, sparking debate over the ethics of “checkbook journalism.”

Stangl & the Holocaust

Franz Stangl was the most effective Nazi administrator in Poland, killing nearly one million Jews at Treblinka, but he claimed he was simply following orders.

Teaching Blackface: A Lesson on Stereotypes

A teacher was put on leave for showing a blackface video during a lesson on racial segregation, sparking discussion over how to teach about stereotypes.

The Astros’ Sign-Stealing Scandal

The Houston Astros rode a wave of success, culminating in a World Series win, but it all came crashing down when their sign-stealing scheme was revealed.

The Central Park Five

Despite the indisputable and overwhelming evidence of the innocence of the Central Park Five, some involved in the case refuse to believe it.

The CIA Leak

Legal and political fallout follows from the leak of classified information that led to the identification of CIA agent Valerie Plame.

The Collapse of Barings Bank

When faced with growing losses, investment banker Nick Leeson took big risks in an attempt to get out from under the losses. He lost.

The Costco Model

How can companies promote positive treatment of employees and benefit from leading with the best practices? Costco offers a model.

The FBI & Apple Security vs. Privacy

How can tech companies and government organizations strike a balance between maintaining national security and protecting user privacy?

The Miss Saigon Controversy

When a white actor was cast for the half-French, half-Vietnamese character in the Broadway production of Miss Saigon, debate ensued.

The Sandusky Scandal

Following the conviction of assistant coach Jerry Sandusky for sexual abuse, debate continues on how much university officials and head coach Joe Paterno knew of the crimes.

The Varsity Blues Scandal

A college admissions prep advisor told wealthy parents that while there were front doors into universities and back doors, he had created a side door that was worth exploring.

Therac-25

Providing radiation therapy to cancer patients, Therac-25 had malfunctions that resulted in 6 deaths. Who is accountable when technology causes harm?

Welfare Reform

The Welfare Reform Act changed how welfare operated, intensifying debate over the government’s role in supporting the poor through direct aid.

Wells Fargo and Moral Emotions

In a settlement with regulators, Wells Fargo Bank admitted that it had created as many as two million accounts for customers without their permission.


Business and the Ethical Implications of Technology: Introduction to the Symposium

  • Published: 13 June 2019
  • Volume 160, pages 307–317 (2019)


  • Kirsten Martin, Katie Shilton & Jeffery Smith


While the ethics of technology is analyzed across disciplines from science and technology studies (STS), engineering, computer science, critical management studies, and law, less attention is paid to the role that firms and managers play in the design, development, and dissemination of technology across communities and within their firm. Although firms play an important role in the development of technology, and make associated value judgments around its use, it remains open how we should understand the contours of what firms owe society as the rate of technological development accelerates. We focus here on digital technologies: devices that rely on rapidly accelerating digital sensing, storage, and transmission capabilities to intervene in human processes. This symposium focuses on how firms should engage ethical choices in developing and deploying these technologies. In this introduction, we, first, identify themes the symposium articles share and discuss how the set of articles illuminate diverse facets of the intersection of technology and business ethics. Second, we use these themes to explore what business ethics offers to the study of technology and, third, what technology studies offers to the field of business ethics. Each field brings expertise that, together, improves our understanding of the ethical implications of technology. Finally we introduce each of the five papers, suggest future research directions, and interpret their implications for business ethics.


Mobile phones track us as we shop at stores and can infer where and when we vote. Algorithms based on commercial data allow firms to sell us products they assume we can afford and avoid showing us products they assume we cannot. Drones watch our neighbors and deliver beverages to fishermen in the middle of a frozen lake. Autonomous vehicles will someday communicate with one another to minimize traffic congestion and thereby energy consumption. Technology has consequences, tests norms, changes what we do or are able to do, acts for us, and makes biased decisions (Friedman and Nissenbaum 1996). The use of technology can also have adverse effects on people. Technology can threaten individual autonomy, violate privacy rights (Laczniak and Murphy 2006), and directly harm individuals financially and physically. Technologies can also be morally contentious by “forcing deep reflection on personal values and societal norms” (Cole and Banerjee 2013, p. 555). Technologies have embedded values or politics, as they make some actions easier or more difficult (Winner 1980), or even work differently for different groups of people (Shcherbina et al. 2017). Technologies also have political consequences by structuring roles and responsibilities in society (Latour 1992) and within organizations (Orlikowski and Barley 2001), many times with contradictory consequences (Markus and Robey 1988).

While the ethics of technology is analyzed across disciplines from science and technology studies (STS), engineering, computer science, critical management studies, and law, less attention is paid to the role that firms and managers play in the design, development, and dissemination of technology across communities and within their firm. As emphasized in a recent Journal of Business Ethics article, Johnson (2015) notes the possibility of a responsibility gap: the abdication of responsibility around decisions that are made as technology takes on roles and tasks previously afforded to humans. Although firms play an important role in the development of technology, and make associated value judgments around its use, it remains open how we should understand the contours of what firms owe society as the rate of technological development accelerates. We focus here on digital technologies: devices that rely on rapidly accelerating digital sensing, storage, and transmission capabilities to intervene in human processes. Within the symposium, digital technologies are conceptualized to include applications of machine learning, information and communications technologies (ICT), and autonomous agents such as drones. This symposium focuses on how firms should engage ethical choices in developing and deploying these technologies. How ought organizations recognize, negotiate, and govern the values, biases, and power uses of technology? How should the inevitable social costs of technology be shouldered by companies, if at all? And what responsibilities should organizations take for designing, implementing, and investing in technology?

This introduction is organized as follows. First, we identify themes the symposium articles share and discuss how the set of articles illuminate diverse facets of the intersection of technology and business ethics. Second, we use these themes to explore what business ethics offers to the study of technology and, third, what technology studies offers to the field of business ethics. Each field brings expertise that, together, improves our understanding of the ethical implications of technology. Finally we introduce each of the five papers, suggest future research directions, and interpret their implications for business ethics.

Technology and the Scope of Business Ethics

For some it may seem self-evident that the use and application of digital technology is value-laden in that how technology is commercialized conveys a range of commitments on values ranging from freedom and individual autonomy, to transparency and fairness. Each of the contributions to this special issue discusses elements of this starting point. They also—implicitly and explicitly—encourage readers to explore the extent to which technology firms are the proper locus of scrutiny when we think about how technology can be developed in a more ethically grounded fashion.

Technology as Value-Laden

The articles in this special issue largely draw from a long tradition in computer ethics and critical technology studies that sees technology as ethically laden: technology is built from various assumptions that—either implicitly or explicitly—express certain value commitments (Johnson 2015; Moor 1985; Winner 1980). This literature argues that, through affordances—properties of technologies that make some actions easier than others—technological artifacts make abstract values material. Ethical assumptions in technology might take the form of particular biases or values accidentally or purposefully built into a product’s design assumptions, as well as unforeseen outcomes that occur during use (Shilton et al. 2013). These issues have taken on much greater concern recently as forms of machine learning and various autonomous digital systems drive an increasing share of decisions made in business and government. The articles in the symposium therefore consider ethical issues in technology design including sources of data, methods of computation, and assumptions in automated decision making, in addition to technology use and outcomes.

A strong example of values-laden technology is the machine learning (ML) algorithms that power autonomous systems. ML technology underlies much of the automation driving business decisions in marketing, operations, and financial management. The algorithms that make up ML systems “learn” by processing large corpora of data. The data upon which algorithms learn, and ultimately render decisions, is a source of ethical challenges. For example, biased data can lead to decisions that discriminate against individuals due to morally arbitrary characteristics, such as race or gender (Danks and London 2017; Barocas and Selbst 2016). One response to this problem is for companies to think more deliberately about how the data driving automation are selected and assessed to understand discriminatory effects. However, the view that an algorithm or computer program can ever be ‘clean’ feeds into the (mistaken) idea that technology can be neutral. An alternative approach is to frame AI decisions—like all decisions—as biased and capable of making mistakes (Martin 2019). The biases can be from the design, the training data, or in the application to human contexts.
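To make the idea of assessing data and decisions for discriminatory effects concrete, here is a minimal sketch of one common audit: comparing an automated decision's approval rates across groups. The toy data and the 0.8 “four-fifths rule” threshold are illustrative assumptions, not a method proposed by the symposium articles.

```python
# Minimal sketch of an algorithmic-bias audit: compare an automated
# decision's approval rates across groups. Data and the 0.8 threshold
# (the "four-fifths rule" heuristic) are illustrative assumptions.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, approved) -> approval rate per group."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
rates = selection_rates(decisions)
ratio = min(rates.values()) / max(rates.values())
print(rates)                                   # per-group approval rates
print(f"disparate impact ratio: {ratio:.2f}")  # worth scrutiny below ~0.80
```

A low ratio does not by itself prove discrimination, but it flags where the selection and assessment of training data deserve the deliberate attention the paragraph above calls for.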

Corporate Responsibility for the Ethical Challenges of Technology

It is becoming increasingly accepted that the firms who design and implement technology have moral obligations to proactively address problematic assumptions behind, and outcomes of, new digital technologies. There are two general reasons why this responsibility rests with the firms that develop and commercialize digital technologies. First, in a nascent regulatory environment, the social costs and ethical problems associated with new technologies are not addressed through other institutions. We do not yet have agencies of oversight, independent methods of assessment, or third parties that can examine how new digital technologies are designed and applied. This may change, but in the interim, the non-ideal case of responsible technological development is internal restraint, not external oversight. An obvious example of this is the numerous efforts put forth by large firms, such as Microsoft and Google, focused on developing principles or standards for the responsible use of artificial intelligence (AI). There are voices of skepticism that such industry efforts will genuinely focus on the public’s interest; however, it is safe to say that the rate of technological development carries an expectation that firms responsible for innovation are also responsible for showing restraint and judgment in how technology is developed and applied (cf. Smith and Shum 2018).

A second reason that new technologies demand greater corporate responsibility is that technologies require attention to ethics during design, and design choices are largely governed by corporations. Design is the projection of how a technology will work in use and includes assumptions as to which users and uses matter and which do not, and how the technology will be used. As STS scholar Akrich notes, “…A large part of the work of innovators is that of ‘inscribing’ this vision of (or prediction about) the world in the technical content of the new object” (Akrich 1992, p. 208). Engineers and operations directors need to be concerned about how certain values—like transparency, fairness, and economic opportunity—are translated into design decisions.

Because values are implicated during technology design, developers make value judgments as part of their corporate roles. Engineers and developers of technology inscribe visions or preferences of how the world works (Akrich 1992; Winner 1980). This inscription manifests in choices about how transparent, easy to understand and fix, or inscrutable a technology is (Martin 2019), as well as who can use it easily or how it might be misused (Friedman and Nissenbaum 1996). Ignoring the value-laden decisions in design does not make them disappear. Philosopher Richard Rudner addresses this in the realm of science; for Rudner, scientists as scientists make value judgements, and ignoring value-laden decisions means those decisions are made badly because they are made without much thought or consideration (Rudner 1953). In other words, if firms ignore the value implications of design, engineers still make moral decisions; they simply do so without an ethical analysis.

Returning to the example of bias-laden ML algorithms illustrates ways that organizations can work to acknowledge and address those biases through their business practices. For example, acknowledging bias aligns with calls for algorithms to be “explainable” or “interpretable”: capable of being deployed in ways that allow users and affected parties to more fully understand how an algorithm rendered its decisions, including potential biases (cf. Kim and Routledge 2018; Kim 2018; Selbst and Barocas 2018). Explainable and interpretable algorithms require design decisions that carry implications for corporate responsibility. If a design team creates an impenetrable AI decision, where users are unable to judge or address potential bias or mistakes, then the firm in which that team works can be seen to have responsibility for those decisions (Martin forthcoming).
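As a hedged illustration of what “interpretable” can mean in practice, the sketch below fits a linear model whose signed weights can be read off and explained to affected users; the feature names and data are invented for illustration and do not represent any firm's actual system.

```python
# Minimal sketch of an interpretable decision model: a logistic regression
# whose coefficients can be shown to users. Features and data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

features = ["income", "debt_ratio", "years_employed"]
X = np.array([[55, 0.2, 6], [20, 0.7, 1], [40, 0.4, 3],
              [70, 0.1, 9], [25, 0.6, 2], [35, 0.5, 2]])
y = np.array([1, 0, 1, 1, 0, 0])  # 1 = application approved

model = LogisticRegression().fit(X, y)
# Each signed coefficient says which way a feature pushes the decision,
# the kind of account an impenetrable model cannot give affected parties.
for name, coef in zip(features, model.coef_[0]):
    print(f"{name:>15}: {coef:+.3f}")
```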

It follows from these two observations (technology firms operate with nascent external oversight, and designers make value-laden decisions as part of their work in firms) that the most direct means of addressing ethical challenges in new technology is through management decisions within technology firms. The articles in this special issue point out many ways this management might take place. For example, in their paper "A Micro-Ethnographic Study of Big Data Innovation in the Financial Services Sector," Richard Owen and Keren Naa Abeka Arthur give a descriptive account of how an organization makes ethics a selling point of a new financial services platform. Ulrich Leicht-Deobald and his colleagues take a normative tack, writing in "The Challenges of Algorithm-Based HR Decision-Making for Personal Integrity" that firms designing technologies to replace human decision making with algorithms should consider their impact on the personal integrity of humans. Tae Wan Kim and Allan Scheller-Wolf present a case for increased corporate responsibility for what they call technological unemployment: the job losses that will accompany an accelerated pace of automation in the workplace. Their discussion, "Technological Unemployment, Meaning in Life, Purpose of Business and the Future of Stakeholders," asks not only what corporations owe to employees who directly lose their jobs to technology, but also what corporations owe to a future society when they pursue workerless production strategies.

The Interface of Business and Technology Ethics

One of the central insights discussed in the pages of this special issue is that technology-driven firms assume a role in society that demands consideration of ethical imperatives beyond their financial bottom line. How does a given technology fit within a broader understanding of the purpose of a firm as value creation for the firm and its stakeholders? The contributions to this special issue, directly or indirectly, affirm that neither the efficiencies produced by the use of digital technology nor enhanced financial returns to equity investors alone justify the development, use, or commercialization of a technology. These arguments will not surprise business ethicists, who routinely debate the purpose and responsibilities of for-profit firms. Still, the fact that for-profit firms use new technology and profit from its development raises the question of how the profit motive affects the ethics of new digital technology.

One way of addressing this question is to take a cue from other, non-digital technologies. For example, the research, development, and commercialization necessary for pharmaceutical products carry ethical considerations for the entities involved, whether individual scientists, government agencies, non-governmental organizations, or for-profit companies. Ethical questions include: How are human test subjects treated? How is research data collected and analyzed? How are research efforts funded, and are there any conflicts of interest that could corrupt the scientific validity of that research? Do medical professionals fully understand the costs and benefits of a particular pharmaceutical product? How should new drugs be priced? For pharmaceutical technology financed through private capital markets, the set of ethical issues includes those raised above plus a consideration of how the profit motive, first, creates competing ethical considerations unrelated to pharmaceutical innovation itself and, second, produces social relationships within firms that may compromise the standing responsibilities that individuals and organizations have to develop pharmaceutical products that support the ideal of patient health.

A parallel story can be told for digital technology. Some ethical issues are closely connected to digital technology, such as trust, knowledge, privacy, and individual autonomy. These issues, however, take on heightened concern when the technologies in question are financed through the profit motive. We have to be attentive to the extent to which a firm's inclination to show concern for customer privacy, for instance, can be marginalized when its business model relies on using predictive analytics for advertising purposes (Roose 2019). A human resource algorithm that could diminish employee autonomy may receive less scrutiny if its use cuts operational expenses in a large, competitive industry. The field of business ethics contributes to the discussion about the responsible use of new technology by illustrating how the interface of the market, the profit motive, and the values of technology can be brought into more stable alignment. Taken together, the contributions in this special issue provide a blueprint for this task. They place technology firmly within the scope of business ethics: managers and firms can (and should) create and implement technology in a way that remains attentive to value creation for the firm and its stakeholders, including employees, users, customers, and communities.

At the same time, those studying the social aspects of technology need to remain mindful of the special nature, and benefits, of business. Business is a valuable social mechanism for financing large-scale innovation and economic progress. It is hard to imagine that some of the purported benefits of autonomous vehicles, for example, would be on our doorstep were it not for nimble, fast-paced private capital markets and decentralized transportation services. Business is important to the development of technology even if we are concerned about how well it upholds the values of responsible use and application of technology. The challenge taken up by the discussions herein is to explore how we want to configure the future and the role that business can play in that future. Are firms exercising sufficient concern for privacy in the use of technology? What are the human costs of relegating more and more decisions to machines rather than to ourselves? Is there an opportunity for further regulatory oversight? If so, in what technological domain? Business ethicists interested in technology need to pay attention to the issues raised by this symposium's authors, and those who study technology need to appreciate the special role that business can play in financing the realization of technology's potential.

In addition, the articles in this symposium illustrate how the intersection of business ethics and technology ethics illuminates how our conceptions of work, and working, shape the ethics of new technology. The contributions herein ask us to think critically about how the employment relationship is altered by the use and application of technology. Again, Ulrich Leicht-Deobald and his co-authors prompt an examination of how the traditional HR function is altered by the assistance of machine-learning platforms. Kim and Scheller-Wolf force an examination of what firms using job-automation technologies owe to both displaced and prospective employees, which expands our conventional notions of responsibility to employees beyond those who happen to be employed by a particular firm in a particular industry. Although not exclusively focused on corporate responsibility within the domain of employment, Aurelie Leclercq-Vandelannoitte's contribution "Is Technological 'Ill-Being' Missing from Corporate Responsibility?" encourages readers to think about the implications of "ubiquitous" uses of information technology for future individual well-being and social meaning. There are clear links between her examination of how uses of technology can adversely impact freedom, privacy, and respect and the ways ethicists and policy makers might rethink firms' social responsibilities to employees. Even more pressing, these discussions provide a critical lens for thinking through more fundamental problems, such as the rise of work outside the confines of the traditional employment relationship in the so-called "gig economy" (Kondo and Singer 2019).

How Business Ethics Informs Technology Ethics

Business ethics can place current technology challenges into perspective by considering the history of businesses and markets behaving outside the norms, and the corrections made over time. For example, the online content industry's claim that changes to the digital marketing ecosystem will kill the industry echoes claims made by steel companies fighting environmental regulation in the 1970s (IAB 2017; Lomas 2019). Complaints that privacy regulation would curtail innovation echo the automobile industry's complaints about safety regulation in the 1970s. Here we highlight two areas where business ethics' understanding of the historical balance between industry desires and pro-social regulation can offer insights for the ethical analysis of technology.

Human Autonomy and Manipulation

A host of market actors are impacted by the rise of digital technology. Consumers are an obvious case. What we buy and how our identities are created through marketing is, arguably, ground zero for many of the ethical issues discussed by the articles in this symposium. Recent work has begun to examine how technology can undermine the autonomy of consumers or users. For example, many games and online platforms are designed to encourage a dopamine response that makes users want to come back for more ("Technology Designed for Addiction" n.d.). Similar to the high produced by gambling machines, which have long been designed for maximum addiction (Schüll 2014), games and social media products encourage users to seek the interaction's positive feedback to the point where their lives can be disrupted. Through addictive design patterns, technology firms create a vulnerable consumer (Brenkert 1998). Addictive design manipulates consumers and takes advantage of human proclivities in ways that threaten their autonomy.

A second example of manipulation and threatened autonomy is the use of aggregated consumer data to target consumers. Data aggregators can frequently gather enough information about consumers to infer their concerns and desires, and then use that information to narrowly and accurately target ads. By pooling diverse information on consumer behavior, such as location data harvested from a phone and Internet browsing behavior tracked by data brokers, marketers can target consumers in ways that undermine their ability to make a different decision (Susser et al. 2019). If marketers infer you are worried about depression based on what you look up or where you go, they can target you with herbal remedies. If marketers guess you are dieting or recently stopped gambling, they can target you with food or casino ads. Business ethics has a long history of examining the ways that marketing strategies target vulnerable populations in a manner that undermines autonomy. A newer twist on this problem is that these tactics have been extended beyond marketing products into politics and the public sphere. Increasingly, social media and digital marketing platforms are being used to inform and sway debate in the public sphere. The Cambridge Analytica scandal is a well-known example of the use of marketing tactics, including consumer profiling and targeting based on social media data, to influence voters. Such tactics have serious implications for autonomy, because individuals' political choices can now be influenced as powerfully as their purchasing decisions.
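The inference step described here can be made concrete with a small sketch (ours, for illustration only; the data, matching rule, and names are invented and far cruder than real ad-tech pipelines). The point is structural: two data streams that are individually unremarkable can, once joined, support a sensitive inference the consumer never disclosed.

```python
# Hypothetical illustration of cross-source inference. All identifiers,
# data, and the matching rule are invented for this sketch.
from typing import Optional

location_pings = {"user_42": ["gym", "mental_health_clinic", "pharmacy"]}
browsing_log = {"user_42": ["insomnia remedies", "am I depressed quiz"]}

def infer_ad_segment(user: str) -> Optional[str]:
    """Toy rule: place a user in a 'low mood' ad segment if both streams agree."""
    visited_clinic = "mental_health_clinic" in location_pings.get(user, [])
    searched_mood = any("depress" in query for query in browsing_log.get(user, []))
    return "low_mood_segment" if visited_clinic and searched_mood else None

print(infer_ad_segment("user_42"))  # -> low_mood_segment
# Neither dataset alone says much; the join does. That asymmetry is what
# allows targeting of inferred vulnerabilities.
```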

More generally, the articles in this symposium help us understand how the creation and implementation of new technology fits alongside the other pressures experienced within businesses. The articles give us lenses on the relationship between an organization's culture (its values, processes, commitments, and governance structures) and the challenge of developing and deploying technology in a responsible fashion. There has been some work on how individual developers might or might not make ethical decisions, but very little work on how pressures from organizations and management matter to those decisions. Recent work by Spiekermann et al., for example, set out to study developers but discovered that corporate cultures around privacy had large impacts on privacy and security design decisions (Spiekermann et al. 2018). Studying corporate cultures of ethics, and the complex motivations that managers, in-house lawyers, strategy teams, and developers bring to ethical decision making, is an important area in business ethics, and one upon which the perspectives collected here shed light.

Much of the current discussion around AI, big data, algorithms, and online platforms centers on trust. How can individuals (or governments) trust AI decisions? How do online platforms reinforce or undermine the trust of their users? How is privacy related to trust in firms and trust online? Trust, defined as someone's willingness to become vulnerable to someone else, is studied at three levels in business ethics: an individual's general trust disposition, an individual's trust in a specific firm, and an individual's institutional trust in a market or community (Pirson et al. 2016). Each level is critical to understanding the ethical implications of technology. Trust disposition has been found to affect whether consumers are concerned about privacy: consumers who are generally trusting may have high privacy expectations but lower concerns about bad acts by firms (Turow et al. 2015).

Users' trust in firms can be influenced by how technology is designed and deployed. In particular, design may inspire consumers to trust particular technologies too much. This problem arguably creates a fourth level of trust unique to businesses developing new digital technologies. More and more diagnostic health care decisions, for example, rely upon automated data analysis and algorithmic decision making. Trust is a particularly pressing topic for such applications, and similar concerns exist for autonomous systems in domains such as financial services and transportation. Trust in AI is not simply about whether a system or decision-making process will "do" what it purportedly states it will do; rather, trust is about having confidence that when the system does something we do not fully understand, it will nevertheless be done in a manner that supports our interests. David Danks (2016) has argued that such a conception of trust moves beyond mere predictability (which artificial intelligence, by definition, makes difficult) and toward a deeper sense of confidence in the system itself (cf. LaRosa and Danks 2018). Finally, more work is needed to identify how technology (e.g., AI decisions, sharing and aggregating data, online platforms, hyper-targeted ads) impacts consumers' institutional trust online. Do consumers see questionable market behavior and begin to distrust an overall market? For example, hearing about privacy violations, such as the use of a data aggregator, impacts individuals' institutional trust online and makes consumers less likely to engage with market actors online (Martin 2019). The study of technology would benefit from the ongoing conversation about trust in business ethics.

Stakeholder Relations

Technology firms face difficult ethical choices in their supply chains and in how products should be developed and sold to customers. For example, technology firms such as Google and Microsoft are openly struggling with whether to create technology for immigration and law enforcement agencies and for U.S. and international militaries. Search engines and social networks must decide what type of relationship to have with foreign governments. Device companies must decide where gadgets will be manufactured, under what working conditions, and where components will be mined and recycled.

Business ethics offers a robust discussion about whether and how to prioritize the interests of various stakeholders. For example, oil companies debate whether and how to include the claims of environmental groups. Auto companies face claims from unions, suppliers, and shareholders and must navigate all three simultaneously. Clothing manufacturers decide whom to partner with for outsourcing. So when cybersecurity firms consider whether to take on foreign governments as clients, their analysis need not be completely new. An ethically attuned approach to cybersecurity will inevitably face the difficult choice of how, if at all, technology should be limited in development, scope, and sale. Similarly, firms developing facial recognition technologies have difficult questions to ask about the viability of those products if they take seriously the perspective of stakeholders who may find those products an affront to privacy. More research on the ethics of new digital technology should use existing work on the ethics of managing stakeholder interests to shed light on how technology firms should appropriately balance the interests of suppliers, financiers, employees, and customers.

How Technology Ethics Informs Business

Just as business ethics can inform the study of recent challenges in technology ethics, scholars who have studied technology, particularly scholars of sociotechnical systems, can add to the conversation in business ethics. Scholarship in values in design (how social and political values become design decisions) can inform discussions about ethics within firms that develop new technologies. And research on the ethical implications of technology (the social impacts of deployed technologies) can inform discussions of downstream consequences for consumers.

Values in Design

Values in design (ViD) is an umbrella term for research in technology studies, computer ethics, human–computer interaction, information studies, and media studies that focuses on how human and social values, ranging from privacy to accessibility to fairness, get built into, or excluded from, emerging technologies. Some values in design scholarship analyzes technologies themselves to understand the values that they do, or do not, support well (Brey 2000; Friedman and Nissenbaum 1996; Winner 1980). Other ViD scholars study the people developing technologies to understand their human and organizational motivations and the ways those relate to design decisions (Spiekermann et al. 2018; JafariNaimi et al. 2015; Manders-Huits and Zimmer 2009; Shilton 2018; Shilton and Greene 2019). A third stream of ViD scholarship builds new technologies that purposefully center particular human values or ethics (Friedman et al. 2017).

Particularly relevant to business ethics is the way this literature examines how both individually and organizationally held values become translated into design features. The values in design literature points out that the material outputs of technology design processes belong alongside policy and practice decisions as an ethical impact of organizations. In this respect, the values one sees in an organization's culture and practices are reflected in its approach to the design of technology, both in how that technology is used and in how it is created. Similarly, an organization's approach to technology is a barometer of its implicit and explicit ethical commitments. Apple and Facebook make use of similar data-driven technologies in providing services to their customers, but how those technologies are put to use (within what particular domain and for what purpose) exposes fundamental differences in the ethical commitments to which each company subscribes. As Apple CEO Tim Cook has argued publicly, unlike Facebook, Apple's business model does not "traffic in your personal life" and will not "monetize [its] customers" (Wong 2018). How Facebook and Apple managers understand the boundaries of individual privacy and acceptable infringements on privacy is conveyed in the manner in which their similar technologies are designed and commercialized.

Ethical Implications of Technology and Social Informatics

Technology studies has also developed a robust understanding of technological agency (how technology acts in the world) while also acknowledging the agency of technology users. Scholars who study the ethical implications of technology and social informatics focus on the ways that deployed technology reshapes power relationships, creates moral consequences, reinforces or undercuts ethical principles, and enables or diminishes stakeholder rights and dignity (Martin forthcoming; Kling 1996). Importantly, technology studies talks about the intersecting roles of material and non-material actors (Latour 1992; Law and Callon 1988). Technology, when working in concert with humans, impacts who does what. For example, algorithms influence the delegation of roles and responsibilities within a decision. Depending on how an algorithm is deployed in the world, humans working with its results may have access to the training data (or not), understand how the algorithm reached a conclusion (or not), and have the ability to see the decision relative to similar decisions (or not). Choices about the delegation of tasks between algorithms and individuals may have moral import, as humans with more insight into the components of an algorithmic decision may be better equipped to spot systemic unfairness. Technology studies offers a robust vocabulary for describing where ethics intersect with technology, ranging from design to deployment decisions. While business ethics includes an ongoing discussion about human autonomy, as noted above, technology studies adds a conversation about technological agency.
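Those three "(or not)" choices can be read as fields in a decision record. The sketch below (ours; the field names and cases are hypothetical) shows how the same algorithmic outcome can be shipped with or without the material a human reviewer would need to exercise real oversight.

```python
# Hypothetical sketch of the deployment choices named above. Whether the
# humans downstream of an algorithmic decision receive an explanation, a
# reference to the training data, and comparable cases is itself a design
# decision with moral import.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DecisionRecord:
    outcome: str
    explanation: Optional[str] = None         # can the reviewer see why?
    training_data_ref: Optional[str] = None   # can they audit what it learned from?
    similar_cases: List[str] = field(default_factory=list)  # context for fairness checks

# An opaque deployment ships only the outcome...
opaque = DecisionRecord(outcome="deny")

# ...while a reviewable one delegates real oversight to the human in the loop.
reviewable = DecisionRecord(
    outcome="deny",
    explanation="income below threshold; short credit history",
    training_data_ref="loan_outcomes_2015_2019_v3",
    similar_cases=["case_1041", "case_1187"],
)
```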

Navigating the Special Issue

The five papers that comprise this thematic symposium range in their concerns from AI and the future of work to big data to surveillance to online cooperative platforms. They explore ethics in the deployment of future technologies, ethics in the relationship between firms and their workers, ethics in the relationship between firms and other firms, and ethical governance of technology use within a firm. All five articles place the responsibility for navigating these difficult ethical issues directly on firms themselves.

Technology and the Future of Employment

Tae Wan Kim and Allan Scheller-Wolf raise a number of important issues related to technologically enabled job automation in their paper "Technological Unemployment, Meaning in Life, Purpose of Business, and the Future of Stakeholders." They begin by emphasizing what they call an "axiological challenge" posed by job automation. The challenge, simply put, is that trends in job automation (including in manufacturing, the service sector, and knowledge-based professions) will likely produce a "crisis in meaning" for individuals. Work, apart from the economic means it provides, is a deep source of meaning in our lives, and a future where work opportunities are increasingly unavailable means that individual citizens will be deprived of the activities that have heretofore defined their social interactions and given their lives purpose. If such a future state is likely, as Kim and Scheller-Wolf speculate, what do we expect of corporations that use the automation strategies that cause "technological unemployment"?

Their answer to this question is complicated, yet instructive. They argue that neither standard shareholder nor stakeholder conceptions of corporate responsibility provide the necessary resources to fully address the crisis in meaning tied to automation. Both approaches fall short because they conceive of corporate responsibility in terms of what is owed to the constituencies that make up the modern firm. But these approaches have little to say about whether there is any entitlement to employment opportunities or whether society is made better off with employment arrangements that provide meaning to individual employees. As such, Kim and Scheller-Wolf posit that there is a second, “teleological challenge” posed by job automation. The moral problem of a future without adequate life-defining employment is something that cannot straightforwardly be answered by existing conceptions of the purpose of the corporation.

Kim and Scheller-Wolf encourage us to think about the future of corporate responsibility with respect to "technological unemployment" by going back to the "Greek agora," which they take to be in line with some of the premises of stakeholder theory. Displaced workers are neither "employees" nor "community" members in the standard senses of those terms. So, as in ancient Greece, the authors imagine a circumstance where meaningful social interactions are facilitated by corporations that offer "university-like" communities in which would-be employees and citizens can participate and collectively deliberate about aspects of the common good, including, but not limited to, how corporations conduct business and how to craft better public policy. This would add a new level of "agency" to their lives and allow them to play an integral role in how business takes place. The restoration of this agency allows individuals to maintain another important sense of meaning in their lives, apart from the work that may have helped define their sense of purpose in prior times. This suggestion is prescriptive and, at times, seems idealistic. But, like other proposals, such as the recent discussion of taxing job automation, it is part of an important set of conversations that need to be had to creatively imagine the future in light of technological advancement (Porter 2019).

The value in this discussion, which frames a distinctive implication for future research, is that it identifies how standard accounts of corporate responsibility are inadequate to justify responsibilities to future workers displaced by automation. It changes the way scholars should understand meaningful work, extending it beyond meaning at work to meaning in place of work, and sketches an alternative to help build a more comprehensive social response to the changing nature of employment that technology will steadily bring.

Technology and Human Well-Being

Aurelie Leclercq-Vandelannoitte's "Is Employee Technological 'Ill-Being' Missing From Corporate Responsibility? The Foucauldian Ethics of Ubiquitous IT Uses in Organizations" explores the employment relationship more conceptually by introducing the concept of "technological ill-being" that accompanies the adoption of ubiquitous information technology in the workplace. Leclercq-Vandelannoitte defines technological ill-being as the tension or disconnect between an individual's social attributes and aspirations when using modern information technology (IT) and the system of norms, rules, and values within the organization. She asks a series of research questions about how technological ill-being is framed in organizations, the extent to which managers are aware of the idea, and who is responsible for employees' technological ill-being.

Leclercq-Vandelannoitte leverages Foucauldian theory and a case study to answer these questions. Foucault offers a rich narrative about the need to protect an individual's ability to "free thought from what it silently thinks and so enable it to think differently" (Foucault 1983, p. 216). The Foucauldian perspective offers an ethical frame by which to analyze ubiquitous IT, where ethics "is a practice of the self in relation to others, through which the self endeavors to act as a moral subject." Perhaps most importantly, the study, through the lens of Foucault, highlights the importance of self-reflection and engagement as necessary to using IT ethically. An international automotive company provides a theoretically important case of the deployment of ubiquitous IT contemporaneous with strong engagement with corporate social responsibility. The organization offers a unique case in that its geographically dispersed units adopted distinct organizational patterns and working arrangements, allowing for comparison.

The results illustrate that technological ill-being is not analyzed in broader CSR initiatives but rather as "localized, individual, or internal consequences for some employees." Further, this blind spot toward employees' ill-being constitutes an abdication of responsibility that benefits the firm. The paper has important implications for the corporate responsibility of organizations with regard to the effects of ubiquitous IT on employee well-being, an underexamined area. The author brings to the foreground the value-ladenness of technology deployed within an organization and centers the conversation on employees in particular. Perhaps most importantly, ethical self-engagement becomes a goal for ethical IT implementation and a critical concept for understanding technological ill-being. Leclercq-Vandelannoitte frames claims of "unawareness" of the value-laden implications of ubiquitous IT as "the purposeful abdication of responsibility," thereby placing the responsibility for technological ill-being squarely on the firm that deploys the IT. Future work could take the same critical lens to firms that sell (rather than internally deploy) ubiquitous IT and their responsibility to their consumers.

Technology and Governance

Richard Owen and Keren Naa Abeka Arthur's "A Micro-Ethnographic Study of Big Data-Based Innovation in the Financial Services Sector: Governance, Ethics and Organisational Practices" uses a case study of a financial services firm to illustrate how organizations might responsibly govern their uses of big data. This topic is timely, as firms in numerous industries struggle to self-regulate their use of sensitive data about their users. The focus on how a firm achieves ethics-oriented innovation is unusual in the literature and provides important evidence of the factors that influence a firm's ability to innovate ethically.

The authors describe a company that governs its uses of big data on multiple levels, including through responses to legislation, industry standards, and internal controls. They illustrate the ways in which the company strives for ethical data policies that support mutual benefit for its stakeholders. Though the company actively uses customer data to develop new products, its innovation processes explicitly incorporate both customer consent mechanisms and client and customer feedback. The company also uses derived, non-identifiable data for developing new insights and products, rather than using customers' identifiable data for innovation. The authors describe how national regulation, while not directly applicable to the big data innovations studied, guided the company's data governance by creating a culture of compliance with national data privacy protections. This has important consequences for both regulators and consumers: it implies that what the authors refer to as "contextual" legislation (law that governs other, marginally related data operations within the firm) can positively influence new innovations as well. The authors write that contextual data protection legislation was internalized by the company and "progressively embedded" into future innovation.
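As an illustration of the "derived, non-identifiable data" practice the authors describe, consider this minimal sketch (ours, with invented fields and values, not drawn from the case study): the innovation team receives cohort-level statistics derived from customer records, while the identifying rows stay behind the governance boundary.

```python
# Hypothetical illustration: product teams get derived aggregates, not
# identifiable customer rows. Field names and values are invented.
from statistics import mean

customer_rows = [
    {"customer_id": "c1", "age": 34, "monthly_spend": 220.0},
    {"customer_id": "c2", "age": 41, "monthly_spend": 180.0},
    {"customer_id": "c3", "age": 36, "monthly_spend": 310.0},
]

def derive_segment_profile(rows):
    """Strip identifiers; keep only cohort-level statistics for product work."""
    return {
        "cohort_size": len(rows),
        "avg_age": round(mean(r["age"] for r in rows), 1),
        "avg_monthly_spend": round(mean(r["monthly_spend"] for r in rows), 2),
    }

# No customer_id survives the derivation step.
print(derive_segment_profile(customer_rows))
# -> {'cohort_size': 3, 'avg_age': 37.0, 'avg_monthly_spend': 236.67}
```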

The authors also found that company employees directly linked ethical values with the success of the company, highlighting consumer trust as critical to both individual job security and organizational success. This finding speaks to the importance of corporate culture in setting the values incorporated into technology design. Owen and Arthur use the company's practices as a case study to begin to define ethical and responsible financial big data innovation. Their evidence supports frameworks for responsible innovation that emphasize stakeholder engagement, anticipatory ethics, reflexivity on design teams, and deliberative processes embedded in development practice.

Technology and Personal Integrity

Ulrich Leicht-Deobald and his colleagues unpack the responsibilities organizations have to their workers when adopting and implementing new data collection and behavior analysis tools in "The Challenges of Algorithm-Based HR Decision-Making for Personal Integrity." The paper unites theory from business ethics with the growing field of critical algorithm and big data studies to examine the topical issue of algorithmic management of workers by human resource departments. The authors focus on tools for human resources decision making that monitor employees and use algorithms and machine learning to make assessments, such as algorithmic hiring and fraud monitoring tools. They argue that, in addition to well-documented problems with bias and fairness, such algorithmic tools have the potential to undermine employees' personal integrity, which they define as consistency between convictions, words, and actions. Algorithmic hiring technologies threaten this fundamental human value by shifting employees to a compliance mindset: the paper demonstrates how algorithmic HR tools undermine personal integrity by encouraging blind trust in rules and discouraging moral imagination, with increased information asymmetries between management and employees among the consequences. The authors classify HR decision making as an issue of corporate responsibility and suggest that companies that wish to use predictive HR technologies must take mitigation measures. They propose participatory design of algorithms, in which employees would be stakeholders in the design process, as one possible mitigating tactic. They also advocate critical data literacy for managers and workers, and adherence to private regulatory regimes such as the Association for Computing Machinery's (ACM) Code of Ethics and Professional Conduct and the Toronto Declaration on machine learning.

This paper makes an important contribution to the scoping of corporate responsibility for the algorithmic age. By arguing that companies using hiring algorithms have a moral duty to protect their workers' personal integrity, it places the ethical dimensions of the design and deployment of algorithms alongside more traditional corporate duties such as responsibility for worker safety and wellness. And like Owen and Arthur, the authors believe that attention to ethics in design, here framed as expanding employees' capacity for moral imagination, will open up spaces for reflection and ethical discourse within companies.

Technology and Trust

Livia Levine's "Digital Trust and Cooperation with an Integrative Digital Social Contract" focuses on digital business communities and the role of their members in creating communities of trust. Levine notes that digital business communities, such as online markets or business social networking communities, have all the markers of a moral community as conceived by Donaldson and Dunfee in their Integrative Social Contracts Theory (ISCT) (Donaldson and Dunfee 1999): the individuals in the community form relationships that generate authentic ethical norms. Digital business communities differ, however, in that participants cannot always identify each other and do not always have the legal or social means to punish participant businesses that renege on the community's norms.

Levine identifies the hypernorm of "the efficient pursuit of aggregate economic welfare," which transcends communities and provides guidance for the development of micronorms within a community, and then focuses on micronorms of trust and cooperation. Levine shows that trust and cooperation are "an instantiation of the hypernorm of necessary social efficiency and that authentic microsocial norms developed for the ends of trust and cooperation are morally binding for members of the community." Levine uses a few examples, such as Wikipedia, open-source software, online reviews, and Reddit, to illustrate micronorms at play. In addition, Levine illustrates how the ideas of community and moral free space should be applied in new arenas, including online.

The paper has important implications both for members of the social contract community and for platforms that host the community as they develop norms focused on trust and cooperation. First, the idea of community has traditionally been applied to people who know each other. Levine makes a compelling case for why community can and should be applied to online groups of strangers: strangers in real life, but known to each other online. Future research could explore the responsibilities of platforms that facilitate or hinder the development of authentic norms for communities on their services. For example, if a gaming platform is seen as a community of gamers, what are the obligations of the gaming platform to enforce hypernorms and support the development of authentic micronorms within communities? Levine's approach opens up many avenues for applying the ideas behind ISCT in new areas.

While each discussion in this symposium offers a specific, stand-alone contribution to the ongoing debate about the ethics of the digital economy, the larger themes addressed by the five articles (the future of employment, personal identity and integrity, governance, and trust) will likely continue to occupy scholars' attention for the foreseeable future. More importantly, the diversity of theoretical perspectives and methods represented within this issue illustrates how the ethical challenges presented by new information technologies are likely best understood through continued cross-disciplinary conversations among engineers, legal theorists, philosophers, organizational behaviorists, and information scientists.

References

Akrich, M. (1992). The de-scription of technological objects. In W. Bijker & J. Law (Eds.), Shaping technology/building society: Studies in sociotechnical change (pp. 205–224). Cambridge, MA: MIT Press.

Barocas, S., & Selbst, A. D. (2016). Big data's disparate impact. California Law Review, 104, 671–733.

Brenkert, G. G. (1998). Marketing and the vulnerable. The Ruffin Series of the Society for Business Ethics, 1, 7–20.

Brey, P. (2000). Method in computer ethics: Towards a multi-level interdisciplinary approach. Ethics and Information Technology, 2(2), 125–129.

Cole, B. M., & Banerjee, P. M. (2013). Morally contentious technology-field intersections: The case of biotechnology in the United States. Journal of Business Ethics, 115(3), 555–574.

Danks, D. (2016). Finding trust and understanding in autonomous systems. The Conversation. Retrieved from https://theconversation.com/finding-trust-and-understanding-in-autonomous-technologies-70245

Danks, D., & London, A. J. (2017). Algorithmic bias in autonomous systems. Proceedings of the 26th International Joint Conference on Artificial Intelligence. Retrieved from https://www.cmu.edu/dietrich/philosophy/docs/london/IJCAI17-AlgorithmicBias-Distrib.pdf

Donaldson, T., & Dunfee, T. W. (1999). Ties that bind: A social contracts approach to business ethics. Harvard Business Press.

Foucault, M. (1983). The subject and power. In H. Dreyfus & P. Rabinow (Eds.), Michel Foucault: Beyond structuralism and hermeneutics (2nd ed., pp. 208–228). Chicago: University of Chicago Press.

Friedman, B., Hendry, D. G., & Borning, A. (2017). A survey of value sensitive design methods. Foundations and Trends in Human–Computer Interaction, 11(2), 63–125.

Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems, 14(3), 330–347.

IAB. (2017). The economic value of the advertising-supported Internet ecosystem. Retrieved from https://www.iab.com/insights/economic-value-advertising-supported-internet-ecosystem/

JafariNaimi, N., Nathan, L., & Hargraves, I. (2015). Values as hypotheses: Design, inquiry, and the service of values. Design Issues, 31(4), 91–104.

Johnson, D. G. (2015). Technology with no human responsibility? Journal of Business Ethics, 127(4), 707–715.

Kim, T. W. (2018). Explainable artificial intelligence, the goodness criteria and the grasp-ability test. Retrieved from https://arxiv.org/abs/1810.09598

Kim, T. W., & Routledge, B. R. (2018). Informational privacy, a right to explanation and interpretable AI. 2018 IEEE Symposium on Privacy-Aware Computing. https://doi.org/10.1109/pac.2018.00013

Kling, R. (1996). Computerization and controversy: Value conflicts and social choices. San Diego: Academic Press.

Kondo, A., & Singer, A. (2019, April 3). Labor without employment. The Regulatory Review. Retrieved from https://www.theregreview.org/2019/04/03/kondo-singer-labor-without-employment/

Laczniak, G. R., & Murphy, P. E. (2006). Marketing, consumers and technology. Business Ethics Quarterly, 16(3), 313–321.

LaRosa, E., & Danks, D. (2018). Impacts on trust of healthcare AI. Proceedings of the 2018 AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society. https://doi.org/10.1145/3278721.3278771

Latour, B. (1992). Where are the missing masses? The sociology of a few mundane artifacts. In W. Bijker & J. Law (Eds.), Shaping technology/building society: Studies in sociotechnical change (pp. 225–258). Cambridge, MA: MIT Press.

Law, J., & Callon, M. (1988). Engineering and sociology in a military aircraft project: A network analysis of technological change. Social Problems, 35(3), 284–297. https://doi.org/10.2307/800623

Lomas, N. (2019, February 21). Even the IAB warned adtech risks EU privacy rules. TechCrunch. Retrieved from https://techcrunch.com/2019/02/21/even-the-iab-warned-adtech-risks-eu-privacy-rules/

Manders-Huits, N., & Zimmer, M. (2009). Values and pragmatic action: The challenges of introducing ethical intelligence in technical design communities. International Review of Information Ethics, 10(2), 37–45.

Markus, M. L., & Robey, D. (1988). Information technology and organizational change: Causal structure in theory and research. Management Science, 34(5), 583–598.

Martin, K. (2019). Designing ethical algorithms. MIS Quarterly Executive, June 2019.

Martin, K. (Forthcoming). Ethics and accountability of algorithms. Journal of Business Ethics.

Moor, J. H. (1985). What is computer ethics? Metaphilosophy, 16(4), 266–275.

Orlikowski, W. J., & Barley, S. R. (2001). Technology and institutions: What can research on information technology and research on organizations learn from each other? MIS Quarterly, 25(2), 145–165.

Pirson, M., Martin, K., & Parmar, B. (2016). Public trust in business and its determinants. Business & Society. https://doi.org/10.1177/0007650316647950

Porter, E. (2019, February 23). Don't fight the robots, tax them. The New York Times. Retrieved from https://www.nytimes.com/2019/02/23/sunday-review/tax-artificial-intelligence.html

Roose, K. (2019, January 30). Maybe only Tim Cook can fix Facebook's privacy problem. The New York Times. Retrieved from https://www.nytimes.com/2019/01/30/technology/facebook-privacy-apple-tim-cook.html

Rudner, R. (1953). The scientist qua scientist makes value judgments. Philosophy of Science, 20(1), 1–6.

Schüll, N. D. (2014). Addiction by design: Machine gambling in Las Vegas (Reprint ed.). Princeton: Princeton University Press.

Selbst, A. D., & Barocas, S. (2018). The intuitive appeal of explainable machines. Fordham Law Review, 87, 1085–1140.

Shcherbina, A., Mattsson, C. M., Waggott, D., Salisbury, H., Christle, J. W., Hastie, T., … Ashley, E. A. (2017). Accuracy in wrist-worn, sensor-based measurements of heart rate and energy expenditure in a diverse cohort. Journal of Personalized Medicine, 7(2), 3. https://doi.org/10.3390/jpm7020003

Shilton, K. (2018). Engaging values despite neutrality: Challenges and approaches to values reflection during the design of internet infrastructure. Science, Technology, & Human Values, 43(2), 247–269.

Shilton, K., & Greene, D. (2019). Linking platforms, practices, and developer ethics: Levers for privacy discourse in mobile application development. Journal of Business Ethics, 155(1), 131–146.

Shilton, K., Koepfler, J. A., & Fleischmann, K. R. (2013). Charting sociotechnical dimensions of values for design research. The Information Society, 29(5), 259–271.

Smith, B., & Shum, H. (2018). The future computed: Artificial intelligence and its role in society. Retrieved from https://blogs.microsoft.com/blog/2018/01/17/future-computed-artificial-intelligence-role-society/

Spiekermann, S., Korunovska, J., & Langheinrich, M. (2018). Inside the organization: Why privacy and security engineering is a challenge for engineers. Proceedings of the IEEE, PP(99), 1–16.

Susser, D., Roessler, B., & Nissenbaum, H. (2019). Online manipulation: Hidden influences in a digital world. Available at SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3306006

Turow, J., Hennessy, M., & Draper, N. (2015). The tradeoff fallacy: How marketers are misrepresenting American consumers and opening them up to exploitation. Annenberg School for Communication. Retrieved from https://www.asc.upenn.edu/sites/default/files/TradeoffFallacy_1.pdf

Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121–136.

Wong, J. (2018, March 28). Apple's Tim Cook rebukes Zuckerberg over Facebook's business model. The Guardian. Retrieved from https://www.theguardian.com/technology/2018/mar/28/facebook-apple-tim-cook-zuckerberg-business-model

Zuckerberg, M. (2019, March 30). The internet needs new rules. The Washington Post. Retrieved from https://www.washingtonpost.com/opinions/mark-zuckerberg-the-internet-needs-new-rules-lets-start-in-these-four-areas/2019/03/29/

Author information

Kirsten Martin, George Washington University, Washington, DC, USA

Katie Shilton, University of Maryland, College Park, MD, USA

Jeffery Smith, Seattle University, Seattle, WA, USA

Corresponding author: Kirsten Martin.

Ethics declarations

Animal and human rights: The authors conducted no research on human participants or animals.

Conflict of interest: The authors declare that they have no conflict of interest.

Informed consent: Informed consent was not required, as the article reports no empirical research.


About this article

Martin, K., Shilton, K., & Smith, J. (2019). Business and the ethical implications of technology: Introduction to the symposium. Journal of Business Ethics, 160, 307–317. https://doi.org/10.1007/s10551-019-04213-9

Received: 22 May 2019. Accepted: 28 May 2019. Published: 13 June 2019. Issue date: December 2019.


Keywords: Socio-technical systems; Science and technology studies; Values in design; Social contract theory

Ethical Issues in Information Technology (IT)

Information technology refers to the components used to store, retrieve, and manipulate information, at the most basic level a server running an operating system. Information technology has a wide range of applications in education, business, health care, industry, banking, and scientific research. With the rapid advancement of information technology, it is necessary to understand the security issues, privacy issues, and main negative impacts of IT, and dealing with these problems starts with identifying the ethical issues involved.

Some of the major ethical issues in information technology, together with their effects, are explained below:

  • Personal Privacy: Privacy is a central ethical issue in information technology. IT lets users with their own hardware, operating systems, and software tools access servers that are connected to each other and to users over a network. Because such networks are distributed on a large scale, data are transferred in large volumes, which creates hidden opportunities for disclosing information and violating the privacy of individuals or groups. Maintaining the privacy and integrity of data is a major challenge for organizations and the IT community; preventing accidental disclosure to inappropriate individuals and protecting the accuracy of data are also part of the privacy issue.
  • Access Right: The second ethical issue is access rights. With rapid advances in technology, access rights have become a high-priority issue for IT and cyberspace, and the growth of e-commerce and electronic payment systems on the internet has heightened the issue for corporations and government agencies. Networks connected to the internet cannot be fully secured against unauthorized access, so intrusion detection systems are generally used to determine whether a user is an intruder or a legitimate user (a minimal sketch of this idea follows this list).
  • Harmful Actions: In computer ethics, harmful actions refer to damage or other negative consequences for IT, such as loss of important information, loss of property, loss of ownership, and destruction of property. This principle of ethical conduct prohibits using information technology in ways that harm users, employees, employers, or the general public. Typically, these actions involve the intentional destruction or alteration of files and programs, causing a serious loss of resources; recovering from them, for example by removing viruses from computer systems, requires extra time and effort.
  • Patents: Patent issues are among the more difficult to deal with. A patent can protect the unique aspects of an idea, but obtaining a patent is much harder than obtaining a copyright, and it requires thorough disclosure: the patent holder must reveal the full details of a program, enough for a proficient programmer to build it.
  • Copyright: Information security specialists need to be familiar with the essential concepts of copyright law. Copyright law is a powerful legal tool for protecting computer software, both before and after a security breach, which might involve the mishandling and misuse of data, computer programs, documentation, and similar material. In many countries, copyright legislation has been amended or revised to provide explicit protection for computer programs.
  • Trade Secrets: Trade secrecy is also a significant ethical issue in information technology. A trade secret protects something of value and usefulness, specifically the private aspects of an idea known only to its discoverer or the discoverer's confidants. Once disclosed, a trade secret is lost as such and is protected only by trade secret law. The application of trade secret law is very broad in the computing field, where even a slight head start in developing software or hardware can provide a significant competitive advantage.
  • Liability: One should be aware of liability when making ethical decisions. A software developer's promises and assertions to the user about the nature and quality of a product can constitute an express warranty. Programmers and retailers are legally bound by the express warranties they make, so they must be realistic when stating claims and predictions about the capabilities, quality, and nature of their software or hardware; every spoken claim about a product may be as legally binding as if it were in writing. All agreements should be in writing to protect against liability, and a disclaimer of express warranties can free a supplier from being held responsible for informal, speculative statements or forecasts made during the negotiation stages.
  • Piracy: Piracy is the making of illegal copies of software. It is entirely up to the owner of the software whether users may make backup copies. As copyright law evolves, legislation that would stop unauthorized duplication of software is under consideration; the software industry is prepared to fight piracy, and the courts are dealing with an increasing number of cases concerning the protection of software.
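As referenced in the Access Right item above, here is a minimal sketch of the intrusion-detection idea (ours, for illustration; the threshold and log format are invented, and real systems use far richer signals than failed-login counts):

```python
# Hypothetical, minimal intrusion-detection sketch: flag users whose
# behavior departs from an expected baseline. Threshold and log format
# are invented for illustration.
from collections import Counter

FAILED_LOGIN_LIMIT = 5  # assumed policy threshold

def flag_suspicious(events):
    """events: (user, outcome) pairs; return users with too many failures."""
    failures = Counter(user for user, outcome in events if outcome == "fail")
    return {user for user, count in failures.items() if count >= FAILED_LOGIN_LIMIT}

log = [("alice", "ok"), ("bob", "fail")] + [("mallory", "fail")] * 7
print(flag_suspicious(log))  # -> {'mallory'}
```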



SOURCES

  1. The Ethical and Social Issues of Information Technology: A Case Study

    Abstract: The present study was conducted among 283 students from the University of Zabol to identify the harms and the ethical and social issues in the field of information technology and to classify ...

  2. The Challenges of IoT: Addressing Security, Ethics, Privacy, and Laws

    Ethical issues: There are certain challenges posed in the ethical domain of IoT, derived from the central ICT ethical issues of accessibility, privacy, property, and integrity of information. The primary ethical issues are examined in this section. 1. Difficult identification: objects need to be identified to connect ...

  3. Case Studies of Ethical Issues and Human Behaviour in ...

    Chapter 8 applies the ethics and human behaviour framework to seven case studies from China, Germany, India, the UK, and the USA. Four of the case studies relate to information and communication technologies (ICT); two cover other issues, related to car testing and genome-modified babies. The remaining case study, on the Bhopal chemical plant, ...

  4. The inclusive analysis of ICT ethical issues on healthy society: a ...

    The work outlined here seeks to acknowledge the effects of an ethical issue on key areas and to provide feedback. The study also provides information about several concrete solutions to this issue in order to ensure the sustainable development of society. ... (ICICT-2020) The inclusive analysis of ICT ethical issues on healthy society: a ...

  5. Case Studies of Ethical and Human Behaviour Issues in ICT and ...

    Technology development raises a number of ethical issues, as well as issues related to behaviour. This chapter discusses some of the issues that arose in the information and communications technology (ICT) and automation industries in Poland, using seven case studies of real situations and the ethics and behaviour framework presented in Chap. 2. The issues discussed are typical of some of the ...

  6. (PDF) Ethical, Legal and Social Aspects of Information and Communication Technology

    Ethical, Legal and Social Aspects of Information and Communication Technology. Minati Mishra, Dept of ICT, FM University, Balasore. Abstract: In this era of computers and communication ...

  7. Legal and Ethical Issues in Educational Technology

    The education community's potential legal issues include plagiarism, copyright, fair use, and safety and privacy. The chapter therefore aims to inform readers about these legal issues; in this way, the writer seeks to assist the audience in addressing these issues as they present themselves in schools. Similar to legal issues, ethical issues in ...

  8. (PDF) Ethical, Psychological and Societal Problems of the Application of ICTs

    The papers included in this volume describe and analyze social, ethical, and legal issues that have arisen as information and communications technologies (ICTs) have been used in education. The aim of the volume is to offer ideas and perspectives that will be helpful in steering the future development and use of ICTs.

  9. Ethical Issues in the Use and Implementation of ICT

    Keywords: computer ethics, cyber ethics, ICT, SMEs, municipal companies. Literature review: Computer ethics is a branch of applied ethics that considers ethical issues raised by computer technology.

  10. Responsibility in Application of ICT as Legal, Moral and Ethical Issues

    In this paper we address one of the most important aspects of using ICT: the collective and individual responsibility of computer experts as a legal, moral, and ethical issue. Responsibility in the role, causal responsibility, rebuke, and legal responsibility are different approaches to understanding the meaning of responsibility in ICT, and they are discussed in this paper. The important question of ...

  11. Issues and Challenges in the Use of Information Communication ...

    ICT has given rise to a host of legal and ethical issues and challenges in its use for education. Pre-service and in-service teachers, as well as students, need to know to a reasonable extent about these issues and challenges. As teachers, potential teachers, and students, they need to be above reproach.

  12. Legal and Ethical Issues of Using ICT

    This leads to problems like plagiarism and copyright violation. Many users worry that others will misuse their computers and might steal their data to commit fraud. These are some of the legal and ethical issues related to using ICT, and teachers need to have a reasonable amount of information about them.

  15. Case Studies

    Case Studies. More than 70 cases pair ethics concepts with real world situations. From journalism, performing arts, and scientific research to sports, law, and business, these case studies explore current and historic ethical dilemmas, their motivating biases, and their consequences. Each case includes discussion questions, related videos, and ...

  16. PDF The Ethical and Social Issues of Information Technology: A Case Study

    Abstract—The present study is conducted among 283 students from University of Zabol to identify the harm and ethical and social issues in the field of information technology and to classify the immoral practices that students are doing in this field. First various important issues in the field of IT in the social and ethical areas are discussed.

  17. PDF UNIT 16 ICT: SOCIAL, LEGAL AND Classroom ICT for Inclusive ETHICAL ISSUES

    77. ICT: Social, Legal and Ethical Issues. ICT phobia is often used in the sense of an unreasonable anxiety.ICT phobia is related with the anxiety about learning with computers or not being able to learn effectively using computers. This is basically to avoid fear of learning new skills mandatory for use of computers in the school or workplace.

  18. PDF Ethical Issues in Use of Ict at Higher Education

    made questionnaire of 20 items to assess the ethical issues of using ICT in higher education. The first step of constructing a scale was the preparation of blueprint of the questionnairethe . All the aspects of ethical issues of ICT were taken into consideration. At the present study, investigators

  19. PDF Legal and Ethical Issues in Educational Technology

    legal and ethical issues to the attention of educators. Therefore, this chapter exam-ines the critical concepts related to legal and ethical issues in educational technol-ogy. As Lagola (2021) argued, educating teachers and parents on the substance of these issues is vital because they can also protect themselves as they model proper

  20. PDF LEGAL AND ETHICAL ISSUES IN ICT

    • There is no legal right to copy other people's work or make it public on the internet. • As information technology becomes increasingly influential, the ethical and legal considerations become similarly relevant. • Advances in communication technology have outpaced the development of accompanying legal standards and ethics codes.

  21. Business and the Ethical Implications of Technology ...

    While the ethics of technology is analyzed across disciplines from science and technology studies (STS), engineering, computer science, critical management studies, and law, less attention is paid to the role that firms and managers play in the design, development, and dissemination of technology across communities and within their firm. Although firms play an important role in the development ...

  22. Legal and ethical issues of using ICT

    This leads to problems like plagiarism and copyright violation. Many users worry that others will misuse their computers and might steal their data to commit fraud. These are some of the legal and ethical issues related to using ICT and teachers need to have a reasonable amount of information about these issues. 21 COPYRIGHT

  23. Ethical Issues in Information Technology (IT)

    The second aspect of ethical issues in information technology is access right. Access right becomes a high priority issue for the IT and cyberspace with the great advancement in technology. E-commerce and Electronic payment systems evolution on the internet heightened this issue for various corporate organizations and government agencies.