

Artificial General Intelligence and A Human-Robot Partnership


A famous example often given for artificial general intelligence (AGI) is an autonomous car that must make a rational decision about which direction to take. Imagine a car driving autonomously in its lane, facing one of two options on an accident-prone path. Option 1: drive straight into a group of four pedestrians and hit them. Option 2: swerve the car, with its one passenger aboard, and hit a single person instead. Which option should it pick? That kind of judgment is what AGI is expected to supply.

Or take very simple examples: a baby can learn what a dog is after seeing only four or five real dogs. A person sees a blind pedestrian crossing the road and offers to help. An old man walking down the street is offered an arm to lean on. Two total strangers strike up a conversation on a shared topic. All of these fall into a category of intelligence where the human brain has a unique response that robots will not be able to simulate, at least in the foreseeable decade.

Hypothetically, let us assume AGI has been successfully built into robots, and one is provided to a small mom-and-pop pizza store as an order taker and pizza maker. Consider all the unique situations such a robot cannot handle, situations that need a human touch:

  • Someone is choking on an artichoke and needs help
  • Someone is ordering a pizza without a crust (naughty!!)
  • Someone is ordering a pizza with 5 crusts (naughty!!)
  • Someone is ordering too much sauce on their pizza
  • Someone spilled water on the floor and needs assistance
  • The pizza robot tripped and fell down and could not get back up

I could keep going. Now you see the point: there is no AGI without human intervention.

This is why AGI should never be treated as a trait a robot must fulfill on its own. Instead, it should be a partnership, an area where humans and robots are trained alike to coexist. Consider the same examples above. An old man or a blind person walking down the street is assisted by a robot, improving his safety and well-being. In fact, smart-cane support systems for blind people have existed for a while. They are becoming smarter: they optimize the walkway for the user and will eventually connect to traffic lights to automatically detect approaching pedestrians. Such a cane can distinguish people (via facial recognition) from objects and can indicate what the user is looking at.
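To make the person-versus-object idea concrete, here is a minimal sketch of the decision layer such a cane might run on top of a vision model. Everything here is an assumption for illustration: the `announce` function, the label names, and the confidence threshold are hypothetical, not the API of any real smart-cane product.

```python
# Illustrative sketch only: a toy decision layer for a smart-cane assistant.
# The upstream detector, its labels, and the threshold are all assumptions.

def announce(detections, person_label="person", min_confidence=0.6):
    """Turn raw detections into a short spoken message for the user.

    detections: list of (label, confidence) pairs produced by some
    upstream vision model (hypothetical here).
    """
    # Split confident detections into people vs everything else.
    people = [d for d in detections
              if d[0] == person_label and d[1] >= min_confidence]
    objects = [d for d in detections
               if d[0] != person_label and d[1] >= min_confidence]

    parts = []
    if people:
        parts.append(f"{len(people)} person(s) ahead")
    if objects:
        names = ", ".join(label for label, _ in objects)
        parts.append(f"objects: {names}")
    return "; ".join(parts) if parts else "path clear"

# Low-confidence detections (the dog at 0.4) are ignored.
print(announce([("person", 0.9), ("bench", 0.8), ("dog", 0.4)]))
```

The point of separating this layer from the detector is that the cane's spoken feedback stays simple and predictable even as the vision model underneath changes.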

Ask us about our AI Literacy and Data Literacy support system, which can help you identify both the potential job losses and the new job opportunities in your organization as part of your journey to AI.


Arvind Murali, Chief Data Strategist

Arvind Murali is the Chief Data Strategist for Data Governance with Perficient. His role includes defining data strategy and governance to deliver transformative data platforms. Arvind has served as an executive advisor for data strategy and governance to organizations across several industries. Arvind’s dedication to solving challenges and identifying new opportunities has provided valuable business-focused results for clients, such as providing self-service access to data for global sales teams; helping physicians create informed wellness plans; and delivering insights about current supply chain inventories. He is a passionate Vlogger on YouTube and discusses real-world insights, data platform trends, and the importance of governance as big data continues its exponential growth.
