GxP Lifeline

Augmented Intelligence Helps Clarify Human/AI Roles in the Workplace


For decades, futuristic science fiction movies have included some version of artificial intelligence (AI). Despite variations in voice, personality, physical appearance, etc., the creators of these enigmatic beings seem intent on having them mimic human qualities. While audiences were watching AI entities and robots come alive on the big screen, scientists were busily researching and developing the real thing. Naturally, the notion of AI triggers concerns about how all that silicon and circuitry will change the workplace — and life in general. Opinions on the matter are varied, but some experts say not to expect a dystopian society anytime soon.

AI: A Pop Culture Perception

Over the years, sci-fi movie makers have depicted AI in different variations of physical form, mannerisms and purpose. A common premise is that the AI entities are created to perform specific tasks — and under no circumstances are they to deviate from said tasks. The AI beings usually follow this directive to a fault, which, of course, is what movie plots are made of. In an effort to alleviate fears of the unknown or the prospect of AI superseding its role among humans, biochemistry professor and avid science fiction writer Isaac Asimov attempted to put up some guardrails around AI, which he summarized in his Three Laws of Robotics:(1)

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • A robot must obey the orders given it by human beings except where such orders would conflict with the first law.
  • A robot must protect its own existence as long as such protection does not conflict with the first or second laws.

Asimov intended these parameters to give some clarity to AI’s role in society. Still, not being one to back down from a challenge, Hollywood set out to demonstrate all the different ways Asimov’s laws could be broken:(2)

  • HAL. Let’s start with one of the most studied, analyzed, imitated, and parodied sci-fi films to date, Stanley Kubrick’s “2001: A Space Odyssey.” The sentient computer, ship steward and reigning chess champion HAL (Heuristically Programmed ALgorithmic Computer) gave AI a voice and something of a face.(1) Naturally, HAL malfunctions during a space mission but refuses to accept that it’s capable of making mistakes. When the crew tries to override HAL and take it offline, it begins to wreak havoc. Yet, even while it goes about systematically eliminating the crew, it maintains its congenial, monotone voice: “I’m sorry, Dave, I cannot obey your orders. I’m contractually obligated to follow the script.”
  • “I, Robot.” This story was actually penned by Asimov and later made into a movie starring Will Smith, who has a lot of experience combating nonhuman foes (another topic for another day). The movie looks at the day-to-day relationships of humans and robots based on the AI’s adherence to Asimov’s Three Laws. However, a primary concept of AI is its ability to keep learning as it continues to gather and process more data. As the story unfolds, the AI beings start to realize that there is more to their existence than what falls within the confines of the Three Laws and their initial programming. They come to believe that humans are clearly a self-destructive bunch and that the very survival of the human race depends on the robots usurping humanity’s power.
  • Cyberdyne Systems Model 101. No discussion of artificial intelligence in films is complete without mentioning “The Terminator.” This movie doesn’t even attempt to stay inside the boundaries of Asimov’s Three Laws. It follows a cyborg dispatched back in time for the express purpose of harming a particular human in order to preserve its species’ post-apocalyptic society. The immortal catchphrase “I’ll be back” was almost certainly meant for the viewing audience, signaling that more Terminator movies would follow.

In all cases, it’s the cunning wit and resourcefulness of the human protagonists that prevail over the rogue AI beings. Given the abundance of movies featuring AI, it’s a wonder there isn’t an Academy Award category for best performance by an anthropomorphized computer system.

AI: The Real World

In the mid-1950s (back in the real world), computer scientists developed AI programs that could best their human counterparts at a game of checkers. While that might make for a fun party trick, the vision for this new artificial intelligence technology went far beyond board games.(3)

AI is defined as the simulation of human intelligence processes by machines. These processes include acquiring information, determining how to analyze and use the information, and self-correcting upon receiving new data. AI algorithms have a programmed ability to continue learning and adapting based on the data they receive, making them highly useful for processes involving the analysis of enormous amounts of data.
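To make that “keep learning from new data” idea concrete, the following is a minimal, hypothetical sketch in Python. It isn’t drawn from any particular product; the toy model, features and learning rate are assumptions chosen purely for illustration, but it shows the basic self-correcting loop: predict, compare against the observed outcome, and adjust whenever new data arrives.

    # Illustrative sketch only: a tiny online learner that adjusts itself
    # every time a new example (features, label) is observed.
    def train_online(examples, learning_rate=0.1):
        weights = [0.0, 0.0]   # two input features in this toy example
        bias = 0.0
        for features, label in examples:        # label is 0 or 1
            score = sum(w * x for w, x in zip(weights, features)) + bias
            prediction = 1 if score > 0 else 0
            error = label - prediction          # the self-correction signal
            weights = [w + learning_rate * error * x
                       for w, x in zip(weights, features)]
            bias += learning_rate * error
        return weights, bias

    # Each new example nudges the model, so it keeps adapting as data arrives.
    stream = [([1.0, 0.0], 1), ([0.0, 1.0], 0), ([1.0, 1.0], 1), ([0.0, 0.0], 0)]
    print(train_online(stream))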

The technology is already being applied in the financial sector to sort, analyze and summarize data to help with developing investment strategies. AI is also being put to use in the life sciences industry:

  • Apple Watch for atrial fibrillation – The wearable device can detect an irregular heart rhythm (atrial fibrillation), which is considered a leading cause of stroke and hospitalization in the U.S.(4) A simplified sketch of how this kind of rhythm check can work appears after this list.
  • Viz.ai stroke detection application – Viz.ai’s software links with CT scanners to identify and triage potential large vessel occlusion (LVO) strokes and can automatically notify specialists and transmit the radiological images to a health care provider’s smartphone.(5)
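To illustrate the irregular-rhythm idea mentioned above, here is a simplified, hypothetical Python sketch. It is not Apple’s actual algorithm; the beat timestamps, the coefficient-of-variation measure and the 0.15 cutoff are assumptions used only to show how beat-to-beat variability could flag a pulse that warrants a closer look.

    # Illustrative only: flags a pulse whose beat-to-beat intervals vary
    # more than a chosen threshold (high variability can suggest AFib).
    from statistics import mean, stdev

    def is_rhythm_irregular(beat_times_s, cv_threshold=0.15):
        """beat_times_s: timestamps (in seconds) of detected heartbeats."""
        intervals = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
        if len(intervals) < 2:
            return False                       # not enough data to judge
        variability = stdev(intervals) / mean(intervals)  # coefficient of variation
        return variability > cv_threshold

    # A steady 60 bpm pulse vs. an erratic one.
    steady = [i * 1.0 for i in range(30)]
    erratic = [0.0, 0.7, 1.9, 2.4, 3.8, 4.3, 5.9, 6.2, 7.8, 8.1]
    print(is_rhythm_irregular(steady))    # False
    print(is_rhythm_irregular(erratic))   # True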

Cutting to the Chase: Will a Computer Take Your Job?

We can put aside any concerns about being subjected to AI-ruled indentured servitude. The technology will most likely become more integrated into the workplace, but not in a displacement-of-humankind sort of way. A statement by Timothy Miller, associate professor at the University of Melbourne, helps sum up and possibly assuage concerns about AI and its impact on the workforce: “Instead of asking what will computers not be able to do, we should ask what will we simply not want them to do?” The following are a few examples:

  • People like to innovate and create and don’t want to hand those roles over to technology.
  • Physicians and clinicians would like more time to interact with patients, another activity best left to humans.
  • Lab technicians would like to spend less time manually compiling and analyzing data and more time using it to make discoveries and progress.
  • Organizational leaders would like to have every bit of the most current and relevant data in a format and language that enables them to make fast, intelligent decisions.

We welcome technology that gives us the ability and time to deliver what brings value to people and society. Turning over repetitive, time-consuming and arduous tasks to automated technology allows humans to be creative and put more effort into innovation and problem-solving. As it is, the aspirations for AI are still pretty lofty. The technology is nowhere near being able to take on complex tasks such as making judgments and decisions for every possible scenario.

Miller went on to say that computers or robots will never possess the ability to empathize and feel emotion. Humans are also capable of calling an audible when needed and rationalizing their decisions — algorithms have no concept of rationalization. Nor do we want computers deciding which problems are important for us.(6)

Augmented Intelligence

In essence, AI will actually be creating jobs. The workforce is evolving, and the skill sets of emerging generations of workers are inherently geared more toward technology, creativity and innovation. The workplace will have a different look and feel only because technology will allow companies to benefit from the contributions of humans working in concert with technology.

A more fitting term for AI might be augmented intelligence. This is a concept of AI technology that focuses on AI’s assistive role, emphasizing the fact that cognitive technology is designed to enhance human intelligence rather than replace it. AI is indeed changing industry and society. Going forward, there will be new paradigms in education, human creativity will flourish and workplaces will reestablish collaboration, which is sorely lacking in today’s workforce.(7)


References

    1. Biography, “Isaac Asimov.” https://www.biography.com/writer/isaac-asimov
    2. The Guardian, “The Top 20 Artificial Intelligence Films – in Pictures.” https://www.theguardian.com/culture/gallery/2015/jan/08/the-top-20-artificial-intelligence-films-in-pictures
    3. Forbes, “A Very Short History of Artificial Intelligence (AI).” https://www.forbes.com/sites/gilpress/2016/12/30/a-very-short-history-of-artificial-intelligence-ai/#6afd88506fba
    4. Medical Device Network, “Study Shows Apple Watch Helps Detect Atrial Fibrillation.” https://www.medicaldevice-network.com/news/apple-watch-atrial-fibrillation/
    5. FierceBiotech, “Medtronic to Distribute Viz.ai’s Stroke-Spotting AI Imaging Software.” https://www.fiercebiotech.com/medtech/medtronic-to-distribute-viz-ai-s-stroke-spotting-ai-imaging-software
    6. Pursuit, University of Melbourne, “Will a Computer Take Your Job?” https://pursuit.unimelb.edu.au/articles/will-a-computer-take-your-job
    7. Forbes, “Three Things You Need to Know About Augmented Intelligence.” https://www.forbes.com/sites/danielaraya/2019/01/22/3-things-you-need-to-know-about-augmented-intelligence/#12853c5f3fdc


David Jensen is a content marketing specialist at MasterControl, where he is responsible for researching and writing content for web pages, white papers, brochures, emails, blog posts, presentation materials and social media. He has over 25 years of experience producing instructional, marketing and public relations content for various technology-related industries and audiences. Jensen writes extensively about cybersecurity, data integrity, cloud computing and medical device manufacturing. He has published articles in various industry publications such as Medical Product Outsourcing (MPO) and Bio Utah. Jensen holds a bachelor’s degree in communications from Weber State University and a master’s degree in professional communication from Westminster College.

