• Published on: Jun 20, 2020
  • 4 minute read
  • By: Dr Rajan Choudhary

Artificial Intelligence In Healthcare


Artificial intelligence. This phrase means different things to different people. To some, it conjures ideas of robots with the same intelligence and creativity as humans, able to carry out any task we give them, only better than we can. To others it is a new and exciting tool, one that could revolutionise the way we work, but also the way labour is distributed in society. And for developers? They dread being asked to build an artificial intelligence system by people who have only encountered buzzwords such as “machine learning” and “deep neural network” in headlines and blogs.

In this blog we will look at the basics of AI terminology, so we can understand what these terms really mean, and whether they will have an impact on healthcare.

WHAT IS ARTIFICIAL INTELLIGENCE?

Even this question is difficult to answer, as it enters the realm of philosophy and debate over the meaning of intelligence. What makes a person intelligent? Is it their retained knowledge? A computer can store the entirety of known human knowledge on a disc. Is it understanding and following instructions? Or is it creativity, a skill even the average person may struggle with at times? We know one thing for sure: distilling a person’s intelligence down to a single IQ number is disingenuous and doesn’t represent true intelligence.

Similarly, people define AI in different ways. A broad definition looks at the ability of a computer or programme to respond autonomously to commands and to the changing environment around it, recognise audio or visual cues, process information without strictly defined rules, and produce a desired output.

The key features appear to be autonomy, the ability to function independently of a human controller or guide, and adaptability, the ability to work beyond strict rules and criteria and to handle situations or inputs beyond the original programming.

In medicine, a “dumb” system could work with physical values, for instance blood results, compare them to a “normal range”, and flag abnormal results (e.g. identifying whether the patient has anaemia).

A smart “AI” would be able to look at a CT scan, notice subtle changes in the images, compare them against what a normal scan should look like, and identify the pathology. This is very difficult because normal scans can differ noticeably between patients (for instance due to anatomical differences), and disease findings can be even more varied and unusual. Human brains have incredibly complex pattern recognition systems – over a third of the human brain is dedicated to visual processing alone. Imagine trying to re-create that in code.

At first, people tried to emulate this with fixed programming. For instance, to teach a programme to recognise a bicycle, you would need to teach it to first exclude anything that is not a vehicle, then exclude anything that does not have wheels, or has more than 2 wheels, or lacks a frame connecting the two wheels, or lacks a chain connecting the pedals to the rear wheel… and so on. All of this for a bike. Now imagine trying to code it to recognise subtle changes to cells under a microscope, to recognise cancer cells, or to recognise an abnormal mass on a scan. Clearly this solution is clunky, and simply not feasible.
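The clunkiness of this rule-based approach is easier to feel in code. Below is a minimal sketch of a hand-written bicycle check; the feature names are invented for illustration, not taken from any real system:

```python
# A hand-coded, rule-based check for recognising a bicycle.
# Every rule must be written and maintained by a programmer, and any
# object that breaks an assumption (a penny-farthing, a folded bike)
# slips through. Feature names here are invented for illustration.

def looks_like_bicycle(obj: dict) -> bool:
    if not obj.get("is_vehicle"):
        return False
    if obj.get("wheel_count") != 2:
        return False
    if not obj.get("has_frame_connecting_wheels"):
        return False
    if not obj.get("has_pedal_chain"):
        return False
    # ...and dozens more rules for handlebars, saddle, size, and so on.
    return True

bike = {"is_vehicle": True, "wheel_count": 2,
        "has_frame_connecting_wheels": True, "has_pedal_chain": True}
motorbike = {"is_vehicle": True, "wheel_count": 2,
             "has_frame_connecting_wheels": True, "has_pedal_chain": False}
print(looks_like_bicycle(bike))       # True
print(looks_like_bicycle(motorbike))  # False
```

Every new edge case means another hand-written rule, which is exactly why this style of programming collapses for microscope slides and scans.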

MACHINE LEARNING

Modern AI systems have moved towards “machine learning”. This is a statistical technique that fits models to input data, “learning” by training on known data sets. Instead of a person defining what a bicycle is, the model is flooded with thousands of pictures of bikes, and the programme forms its own rules for identifying a bike. If this model is then shown a picture, it can output the statistical likelihood of the picture being a bike. The system could be expanded by further training the model with pictures of motorbikes, scooters and other two-wheeled forms of transport. Now, given a picture, the model can determine which type of two-wheeled transport it has been shown.
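The idea of “forming its own rules from examples” can be sketched with one of the simplest possible learners, a nearest-centroid classifier over toy feature vectors. The two features (wheel diameter in metres, weight in kilograms) and all the numbers are invented for illustration; real image models learn from pixels, not hand-picked features:

```python
# Minimal sketch of learning from labelled examples: a nearest-centroid
# classifier over toy (wheel diameter, weight) feature vectors.
# All features and numbers are invented for illustration.
from math import dist

training = {
    "bicycle":   [(0.66, 12.0), (0.70, 10.5), (0.62, 13.0)],
    "motorbike": [(0.45, 180.0), (0.50, 200.0), (0.48, 160.0)],
}

# "Training": the model summarises each class as the mean of its examples.
centroids = {
    label: tuple(sum(col) / len(col) for col in zip(*examples))
    for label, examples in training.items()
}

def classify(features):
    # Predict the class whose learned centroid is closest to the input.
    return min(centroids, key=lambda label: dist(features, centroids[label]))

print(classify((0.68, 11.0)))   # "bicycle"
print(classify((0.47, 175.0)))  # "motorbike"
```

No one wrote a rule saying what a bicycle is; the boundary between the classes comes entirely from the labelled examples, which is the core shift machine learning makes.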

The healthcare application can be simple – let’s look at a radiology example. Teach an AI model what normal lungs look like, then show it images of various pathologies such as pneumonia, fibrosis or even lung cancer. If fed enough images and variations of a disease, the AI’s statistical analysis might even find associations and patterns for identifying the disease that a human radiologist would be unable to find.

NEURAL NETWORKS

A more complex form of machine learning is the neural network. Its name suggests it is analogous to the neurons in a human brain, though the analogy does not stretch much further. A neural network splits the image into many different components, analyses those components for learned features, and combines the results to produce a decision.
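The “combine features into a decision” step is just repeated weighted sums passed through a non-linearity. Here is a minimal forward pass for a single-hidden-layer network; the weights are hand-picked purely for illustration, whereas a real network learns them from training data:

```python
# Sketch of a single hidden-layer neural network forward pass.
# Weights are hand-picked for illustration; real networks learn
# these values from thousands of training examples.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, hidden_weights, output_weights):
    # Layer 1: each hidden "neuron" combines all inputs and applies
    # a non-linearity, extracting an intermediate feature.
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)))
              for ws in hidden_weights]
    # Layer 2: the output neuron combines those features into one score.
    return sigmoid(sum(w * h for w, h in zip(output_weights, hidden)))

# Two inputs, three hidden neurons, one output (weights are arbitrary).
hw = [(2.0, -1.0), (-1.5, 2.5), (0.5, 0.5)]
ow = (1.0, -2.0, 1.5)
score = forward((0.8, 0.2), hw, ow)
print(round(score, 3))  # a probability-like value between 0 and 1
```

Stacking many such layers, with thousands of neurons each, is what the next section calls deep learning.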

The most complex forms of machine learning involve deep learning. These models utilise thousands of hidden features and pass data through several layers of analysis before a decision is made. As computing power increases, so does the ability to create ever more complex models that can examine dense, three-dimensional images. Deep learning models have been able to identify cancers in CT and MRI scans that had been missed even by expert consultants. They can also identify structures and patterns the human eye cannot, and may end up being better at diagnosis than a highly trained specialist. Of course, such diagnoses would still have to be checked by a doctor, because of the medico-legal implications of incorrect diagnoses produced by models that even their programmers cannot fully interpret.

NATURAL LANGUAGE PROCESSING

But the application of AI is not limited to identifying images and scans. One of the greatest hurdles a computer faces is understanding human speech. Dictation from speech to text is easy; understanding the meaning of what was said, and using it to create instructions or datasets, is hard. This is why the iPhone’s Siri and Google Assistant on Android phones seem so limited. They can only recognise certain set instructions such as “What is the weather?” or “Set an alarm for…”. More complicated instructions or requests usually result in an error.

People don’t speak in simple sentences. If asked about their symptoms, every patient will use different sentence structures and adjectives, prioritise different symptoms depending on how they are affected, and create a narrative rather than a list of symptoms. Similarly, when writing in patients’ notes, doctors use complex sentences and shorthand, and structure their notes differently. Feeding this information to Siri would not output a clear diagnosis, but rather give the poor digital assistant a migraine.

Deep learning is being used to analyse natural speech and pick out the important information that will lead to a diagnosis, similar to how a medical student is trained to take a history. If deployed successfully, this would be invaluable in triaging patients based on the severity of their symptoms and assigning them to the right specialists.
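To see both the goal and why naive approaches fail, here is a toy keyword-based symptom extractor. Real clinical NLP uses trained language models; the symptom vocabulary below is invented for illustration:

```python
# Toy sketch of extracting structured symptoms from free-text notes.
# Real clinical NLP uses trained deep learning models; this keyword
# lookup only illustrates the goal (terms and synonyms are invented).
import re

SYMPTOM_SYNONYMS = {
    "chest pain": {"chest pain", "tightness in my chest", "chest tightness"},
    "breathlessness": {"short of breath", "breathless"},
    "cough": {"cough", "coughing"},
}

def extract_symptoms(note: str) -> set:
    text = note.lower()
    found = set()
    for symptom, phrases in SYMPTOM_SYNONYMS.items():
        if any(re.search(r"\b" + re.escape(p) + r"\b", text) for p in phrases):
            found.add(symptom)
    return found

note = ("Pt c/o tightness in my chest since morning, became short of "
        "breath climbing stairs. No cough.")
print(sorted(extract_symptoms(note)))
# ['breathlessness', 'chest pain', 'cough']
```

Note the failure: “No cough” is wrongly flagged as a cough, because keyword matching cannot handle negation, and it would also miss any phrasing not in its synonym list. Handling narrative structure, negation and novel wording is exactly what the deep learning approaches above are trained to do.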

It would also have huge implications for research. Identifying data is very labour- and time-intensive, and the cost of trawling through patient notes can significantly limit the feasibility of research studies. A deep learning system could read through the notes and identify all the important symptoms, how a patient is improving day to day, and other subtle parameters, doing so across thousands of cases without supervision, boredom or fatigue. The wealth of information made available could significantly improve the quality of research performed.

Artificial intelligence and its various buzzwords can be difficult to break down and digest. This blog will certainly not answer all of your questions, and may leave you with more than you started with. But understanding the basics of AI helps in appreciating the effort that goes into creating these systems, and in acknowledging the hurdles that keep AI from becoming prevalent across healthcare.

At least for now. Progress in this field is constant. By next year the AI landscape may be very different.

Dr Rajan Choudhary

HEAD OF PRODUCTS, SECOND MEDIC INC UK


Chronic Disease Management in Digital India: How SecondMedic Is Transforming Long-Term Care

In India, chronic diseases are the silent epidemic. From diabetes and hypertension to COPD and heart disorders, these conditions affect millions - and demand long-term, consistent care.

Traditionally, managing these illnesses meant frequent hospital visits and reactive treatment. But in Digital India, technology has changed the game. Platforms like SecondMedic are making chronic care predictive, preventive, and personalized.

 

The Chronic Disease Burden in India

According to the World Health Organization (WHO), chronic diseases account for over 60% of deaths in India.
The Indian Council of Medical Research (ICMR) reports that:

  • 1 in 4 Indians suffer from a chronic condition.

  • 77 million people are diabetic.

  • 220 million live with hypertension or cardiovascular risk.
     

The challenge? Managing these conditions continuously - not just during hospital visits.

 

How Digital Transformation Is Changing the Game

The rise of digital healthcare - teleconsultations, remote monitoring, and AI analytics - has turned chronic care into an ongoing, data-driven process.

Predictive analytics, powered by AI, identifies early warning signs and suggests interventions before crises occur.
Wearable devices track vital parameters like heart rate, oxygen, glucose, and BP 24×7.
Cloud-based health records allow doctors to review trends remotely and adjust treatment instantly.
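A remote-monitoring alert of the kind described can be sketched, in highly simplified form, as a rolling-average threshold check over a patient’s readings. The glucose values, window size and threshold below are invented for illustration and are not clinical guidance:

```python
# Highly simplified sketch of a remote-monitoring alert: flag a patient
# when the rolling average of glucose readings drifts above a threshold.
# Values, window and threshold are invented for illustration only.
from collections import deque

def rolling_alerts(readings, threshold=180.0, window=3):
    recent = deque(maxlen=window)  # keeps only the last `window` readings
    alerts = []
    for day, value in enumerate(readings, start=1):
        recent.append(value)
        avg = sum(recent) / len(recent)
        if len(recent) == window and avg > threshold:
            alerts.append((day, round(avg, 1)))
    return alerts

glucose = [150, 160, 170, 185, 200, 210]  # mg/dL, one reading per day
print(rolling_alerts(glucose))  # [(5, 185.0), (6, 198.3)]
```

Averaging over a window smooths out single spikes, so the alert fires on a sustained trend rather than one bad reading, which is the basic idea behind the predictive alerts described above.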

A NASSCOM Digital Health Report (2024) notes that remote monitoring adoption has increased by 68% since 2020, saving up to 25% in hospitalization costs.

 

How SecondMedic Makes Chronic Care Smarter

SecondMedic combines medical expertise with cutting-edge technology to empower patients:

  • Remote Doctor Consultations - Regular virtual follow-ups for chronic patients.

  • AI-Powered Health Dashboard - Smart algorithms detect risk trends and trigger alerts.

  • Lab & Diagnostic Integration - Automatic syncing of test results for doctor review.

  • Personalized Health Plans - Tailored diet, exercise, and medication guidance.

  • Continuous Monitoring - Devices and data integration for real-time oversight.
     

This holistic approach ensures proactive management - keeping patients healthier and reducing the chance of emergencies.

“Digital tools have allowed us to shift from managing illness to maintaining wellness.”
- Dr. Meenakshi Sharma, Medical Director, SecondMedic

 

Real-World Impact & Market Insights

  • Market Growth: India’s chronic care management market is projected to reach USD 11.2 billion by 2030, growing at CAGR 12.5% (IMARC Group 2025).

  • Digital Adoption: 74% of doctors use digital tools to monitor chronic patients remotely (FICCI HealthTech Survey 2025).

  • SecondMedic Data: Users enrolled in chronic care programs show 28% fewer hospitalizations and 40% better treatment adherence.
     

 

Challenges Ahead

Despite progress, India faces key hurdles:

  • Limited digital literacy among elderly patients.

  • Unequal internet access in rural regions.

  • Need for regulatory clarity on remote prescriptions.

  • Integration between hospital and home-based care systems.
     

But with the Ayushman Bharat Digital Mission (ABDM) and telemedicine policy frameworks, these challenges are rapidly being addressed.

 

Conclusion

Chronic disease management in Digital India is not about occasional care - it’s about continuous connection.
With platforms like SecondMedic, chronic patients can now access doctors, diagnostics, and AI health tracking - all from the comfort of home.

Healthcare is no longer reactive - it’s proactive, predictive, and personal.

Take control of your health today at www.secondmedic.com

 

