A small mobile microscope fits easily into a suitcase, and its components cost about as much as a smartphone. The device takes digital scans of specimens smeared on a glass slide, after which the images can be sent anywhere for analysis over mobile networks.
And the one doing the analysis does not have to be human.
Computer vision enables specimen diagnostics based on artificial intelligence. In practice, this means computers are taught to identify abnormalities in image datasets. Malaria parasites, for example, appear as spots inside the red blood cells of a blood sample.
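The principle can be illustrated with a toy example (not the FIMM group's actual method): flag stained regions in a microscope image by thresholding pixel intensity and counting connected components. The function name, threshold and synthetic image below are invented for illustration; real diagnostic systems learn such detectors from large labelled datasets rather than using a fixed threshold.

```python
import numpy as np
from scipy import ndimage

def count_spots(image, threshold=0.5):
    """Count stained 'spots' (stand-ins for parasites) in a grayscale
    image by thresholding and labelling connected components."""
    mask = image > threshold                 # pixels brighter than the stain threshold
    labels, n_spots = ndimage.label(mask)    # group adjacent pixels into blobs
    return n_spots

# Synthetic 8x8 "smear" with two bright blobs standing in for parasites
img = np.zeros((8, 8))
img[1:3, 1:3] = 1.0
img[5:7, 5:7] = 1.0
print(count_spots(img))  # 2
```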
A group led by Johan Lundin, working at the Institute for Molecular Medicine Finland (FIMM) at the University of Helsinki, is investigating and developing devices and applications suitable for point-of-care diagnostics.
According to Lundin, their objective is to make it easier to carry out diagnostics particularly in regions where experts are scarce.
Promising results have already been achieved in diagnosing infections caused by parasites common in developing countries. Currently, the mobile microscope and its potential in remote diagnostics are being tested for diagnosing schistosomiasis at a Tanzanian school.
Alongside malaria, schistosomiasis, also known as bilharzia, is one of the most common parasitic diseases. Its diagnosis is relatively uncomplicated, but in the future, computer vision can be harnessed for increasingly complicated tasks through a technique known as deep learning.
With the help of biobank data, Lundin’s group is currently working on an algorithm that will identify cellular changes in Pap smear specimens. Tests will be carried out in Kenya as early as this year.
In Africa, there is on average only one pathologist per million citizens, which makes screening for premalignant conditions of the uterine cervix rare. This leads to delayed diagnoses, even though cervical cancer is the most common cause of cancer deaths among African women.
Cancer treatment efficacy increased by data mining
Several research projects currently under way at the University of Helsinki aim to develop increasingly effective drug therapies with the help of artificial intelligence. Researchers are working on algorithms that help choose the right drug for the right patient. In other words, artificial intelligence is used to achieve more personalised therapies.
Drug efficacy is affected by several individual factors, which can complicate the selection of the most suitable drug or combination of drugs. Researchers are studying genetic differences between patients to understand why a certain drug works in some patients but not in others. While producing data on the factors affecting drug efficacy at the molecular level is no longer an issue, collecting and analysing this data reliably and effectively remains a challenge.
Jing Tang of FIMM leads a research project that aims to develop a computational method for providing personalised cancer drug recommendations. The project is also a finalist team in the Helsinki Challenge competition.
In addition to genetic data, Tang is utilising results from cancer drug tests carried out ex vivo. In these tests, the efficacy of cancer drugs is assessed in the laboratory on cancer cells extracted directly from clinical specimens. This method makes it possible to test many drugs, even hundreds, with very small drug amounts.
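One common way to summarise such ex vivo dose-response measurements is a sensitivity score: the area over the measured cell-viability curve across a log-dose range, so that drugs which kill cancer cells at low doses score higher. The sketch below is a simplified stand-in, not Tang's actual scoring method; the function name and example numbers are invented.

```python
import numpy as np

def sensitivity_score(concentrations, viabilities):
    """Crude drug-sensitivity score: trapezoidal area over the viability
    curve across log10 concentration, normalised to [0, 1].
    Higher score = cells die at lower doses = more sensitive."""
    log_c = np.log10(np.asarray(concentrations, dtype=float))
    inhibition = 1.0 - np.asarray(viabilities, dtype=float)  # fraction of cells killed
    # trapezoidal integration over the log-dose axis
    area = np.sum((inhibition[1:] + inhibition[:-1]) / 2 * np.diff(log_c))
    return area / (log_c[-1] - log_c[0])                     # normalise by dose range

doses = [1, 10, 100, 1000, 10000]            # nM, five-point dilution series
effective = [1.0, 0.8, 0.4, 0.1, 0.0]        # viability drops quickly with dose
weak      = [1.0, 0.95, 0.9, 0.85, 0.8]      # little effect even at high dose

print(sensitivity_score(doses, effective) > sensitivity_score(doses, weak))  # True
```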
Computers enable, humans decide!
At the moment, only one in four cancer drugs leads to the desired treatment outcome. Precious time needed for healing is wasted in the hunt for suitable medication.
Statins, used among other things to lower cholesterol, are also hampered by ineffectiveness and adverse effects: one in four patients stops taking statins within a year, even though the treatment is intended to be permanent. High cholesterol is a significant risk factor for cardiovascular disease.
An algorithm for statin selection, predicting the efficacy of the drug and its adverse effects, is also under development.
The use of artificial intelligence in choosing the right drug therapy has not yet been clinically tested, but this is one of the goals for the coming years. However, the researchers stress that final treatment decisions will always be made by human experts, that is, doctors.
Instead of or in addition to human contact?
Social psychologists, too, are interested in the consequences of the increasing presence of technology in human life.
“Consideration is often given to the decrease in human contacts caused by technology becoming increasingly common. The same consideration could be given to the potentially positive effects of technology for people who lack human interaction,” says Niklas Ravaja, university researcher at the Helsinki Collegium for Advanced Studies.
Ravaja is heading a research project where social psychologists from the University of Helsinki and researchers from Aalto University are studying social interaction and emotions in virtual reality.
“It is interesting to find out what happens in virtual reality and how it differs from face-to-face interaction,” muses Ravaja.
The project utilises neuroadaptive virtual reality that adapts to the neurophysiological activity of its users.
In practice, the bodily functions of study subjects, such as the electroencephalogram, heartbeat and breathing, will be monitored during exercises completed in virtual reality, and the virtual environment will simultaneously adapt to the measurement results.
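Such an adaptation loop can be sketched as a simple feedback controller: measure a signal, compare it to a target, and nudge the environment accordingly. This is a minimal, hypothetical illustration; the parameter names and gain are invented, and real neuroadaptive systems are far more sophisticated.

```python
def adapt_environment(arousal, intensity, target=0.5, gain=0.2):
    """One step of a neuroadaptive control loop: nudge the VR scene's
    stimulus intensity toward whatever keeps measured arousal near target."""
    error = arousal - target
    new_intensity = intensity - gain * error       # too aroused -> calm the scene down
    return min(max(new_intensity, 0.0), 1.0)       # clamp to the valid range

# Simulated session: a stressed user (high arousal) should see intensity fall
intensity = 0.8
for arousal in [0.9, 0.85, 0.8]:                   # e.g. normalised heart-rate readings
    intensity = adapt_environment(arousal, intensity)
print(intensity < 0.8)  # True
```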
“Our founding principle is that neurophysiological measurements tell us something about a person’s emotional and cognitive state,” says Ravaja.
The exercises that study subjects are asked to complete can be used, for example, to produce feelings of empathy in virtual reality. Empathy, linked with mental wellbeing, can be detected on an electroencephalogram.
The research designs used will enable the study of regularities in interaction. At the same time, potential applications of virtual reality are being considered. According to Ravaja, mindfulness exercises carried out in virtual reality may help people recover from work-induced stress. They can also be used to treat phobias or as part of online therapy.
The feeling of immersion can be increased further with a haptic glove that conveys tactile sensations.
“But genuine human touch is really difficult to simulate,” admits Ravaja.
A robot’s touch with more precision and softness
University researcher Jukka Häkkinen is also interested in simulating human touch. To this end, he and his group have developed a system that measures human touch with the help of infrared, visible spectrum and depth-sensing cameras.
“They enable the capture of touch impressions, which can then be added to a three-dimensional model used for teaching robots,” says Häkkinen.
The Grasp Sense project, led by Häkkinen, will be on display at the University of Helsinki stand at Slush at the beginning of December.
Training in grasping and touching objects is necessary both for logistics robots sorting packages and for the nursing care robots of the future. In people’s homes, robots will have to handle complicated and unpredictable environments, and their touch must not feel unpleasant.
“Interaction with humans is difficult to realise if support and help given by a robot does not feel pleasant and natural,” says Häkkinen.
How does the idea of a robot as a carer feel?
This article is part of the University of Helsinki's 2017 People in Change science theme.