Technology and mental health: The role of artificial intelligence


workers, which authors attributed to heightened awareness of
pathological thought and drinking behaviour [9].
4.2. Accessibility, de-stigmatisation and personalisation
In 45% of the world there is fewer than one psychiatrist per
100,000 people, yet over 50% of the world's population now owns a
smartphone. A single AI system can serve very large populations,
provided users have access to the required hardware, which is
increasingly ubiquitous. Such technology could increase
accessibility by reducing lengthy and costly travel to centrally-
located mental health clinics. It may enable over-burdened mental
health professionals to increase the reach of their services. It could
also help people whose conditions restrict their ability to travel,
such as generalised anxiety disorder (GAD), agoraphobia or
physical health problems.
AI may also increase accessibility by circumventing some of the
stigma surrounding mental illness. For example, chatbots may
avoid such stigma as they are not part of a wider social construct
with all the associated cultural norms and expectations. They are
more likely to be perceived as non-judgmental, non-opinionated
and overall neutral.
AI may enable greater personalisation of care. For example,
more sophisticated analysis of large volumes of data may enable
better prediction of therapeutic response and side effect profile
from different medications. This would reduce the need to perform
multiple trials of different medications.
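As a purely illustrative sketch of this idea (the data is synthetic, the feature names are hypothetical, and a deliberately simple nearest-neighbour model stands in for the large-scale statistical models the text envisages), predicting a patient's likely response to a medication from past cases might look like:

```python
import math

# Synthetic, illustrative records: (age, symptom severity 0-10,
# whether the patient responded to a hypothetical drug A).
HISTORY = [
    (25, 4, True), (31, 6, True), (45, 8, False),
    (52, 7, False), (29, 5, True), (60, 9, False),
]

def predict_response(age: float, severity: float, k: int = 3) -> bool:
    """Majority vote among the k most similar past patients.

    A toy stand-in for real therapeutic-response prediction, which
    would use far richer features and clinically validated models.
    """
    neighbours = sorted(
        HISTORY,
        key=lambda r: math.dist((age, severity), (r[0], r[1])),
    )[:k]
    votes = sum(1 for r in neighbours if r[2])
    return votes > k // 2

print(predict_response(30, 5))  # resembles the younger responders
print(predict_response(55, 8))  # resembles the older non-responders
```

The point is only the shape of the approach: given enough historical outcome data, a model could rank candidate medications by predicted response before any trial of treatment begins.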

Clinical governance
The regulation of mobile apps is far less stringent than that of
medical products and treatments. For example, over 47,000 mental
health apps were on sale to US consumers in 2015. The majority of
these had not been validated, and of the few that had, it was
primarily through small-scale, short-term pilot studies. The use of
unvalidated smartphone apps poses risks to patients through poor
quality information and potentially harmful recommendations.
Many of the apps available for mental health may not be classed
as ‘medical products’ as they focus more on self-management and
lifestyle.
Clinical governance in this area is important to prevent
unregulated use and potential harm. In the UK, the NHS has made
efforts to introduce this by creating the “NHS Digital Apps Library”,
which includes ‘NHS Approved’ labels for apps with sufficient
evidence of effectiveness and safety.
It should not be assumed that new technologies will necessarily
have a positive effect; they can plausibly cause harm. For example,
while some people feel more empowered to control their illness
by using passive monitoring, others may feel overwhelmed by the
additional responsibility, or perceive it as a constant reminder that
they are ill [10]. Some technologies may benefit certain
sub-populations yet harm others, and these distinctions must be
established through thorough research.
Re-balancing clinician workload
AI may drastically re-balance a clinician’s workload, giving
them more time to interact with patients, thereby improving the
quality of care. This is particularly true within psychiatry because
it is a text-heavy field. Psychiatrists spend considerable time
reading previous notes to build an accurate picture of a patient’s
history. Natural Language Processing (NLP) is an area of AI
concerned with analysing human language for meaningful
information. NLP could be used to summarise the most important
data from a patient’s electronic health records, providing a
succinct summary at the beginning of a consultation. For example,
a timeline of mental state examinations and the associated
treatment regimen could be produced. NLP could also provide more
powerful analysis if combined with AI analysis of speech and video
to produce a short summary of a patient’s mental state, which
could be a useful supplement to the psychiatrist’s mental state
examination. Such an algorithm would be objective and not
subject to inter-clinician variability.
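As a toy illustration of the timeline idea (the note format, field names and regular-expression matching are all assumptions made for this sketch; a production system would use a validated clinical NLP model rather than pattern matching), extracting dated mental state examination entries from free-text notes might look like:

```python
import re
from datetime import datetime

# Hypothetical free-text clinic notes; the "DD/MM/YYYY MSE: ..." layout
# is invented purely for illustration.
NOTES = """
12/03/2021 MSE: low mood, poor eye contact. Started sertraline 50mg.
02/06/2021 MSE: mood improving, good rapport. Sertraline increased to 100mg.
15/09/2021 MSE: euthymic, full engagement. Continue sertraline 100mg.
"""

def extract_timeline(notes: str):
    """Return dated mental-state entries plus any treatment note,
    sorted chronologically. A toy stand-in for a real NLP pipeline."""
    timeline = []
    pattern = re.compile(r"(\d{2}/\d{2}/\d{4})\s+MSE:\s*([^.]+)\.\s*(.*)")
    for line in notes.strip().splitlines():
        m = pattern.match(line.strip())
        if m:
            date = datetime.strptime(m.group(1), "%d/%m/%Y").date()
            timeline.append({
                "date": date,
                "mental_state": m.group(2).strip(),
                "treatment": m.group(3).strip() or None,
            })
    return sorted(timeline, key=lambda e: e["date"])

timeline = extract_timeline(NOTES)
for entry in timeline:
    print(entry["date"], "|", entry["mental_state"], "|", entry["treatment"])
```

Even this crude version shows the value of the output format: a clinician can scan the mental state trajectory and treatment changes at a glance instead of reading every historical note.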
Conclusion
The successful integration of AI into healthcare could
dramatically improve quality of care. In psychiatry, new tools for
diagnosis, monitoring and treatment may improve patient
outcomes and re-balance clinician workload. While there is great
potential, numerous risks and challenges will arise. These will
require careful navigation to ensure the successful implementation
of this new technology.
Conflicts of interest
Christopher Lovejoy and Varun Buch are employees of Cera
Care. Mahiben Maruthappu is an investor and employee of Cera
Care. Cera Care is a domiciliary care provider conducting research
into how artificial intelligence can be used to improve the care
delivered to elderly people living at home. This work received no
specific grant from any funding agency in the public, commercial,
or not-for-profit sectors.
Data security, privacy and consent
Healthcare data is sensitive, and mental health data is
particularly so given the high risk of stigmatisation and
discrimination in the event of data disclosure. For the public to
embrace this new technology, which is necessary for it to be
effective, the healthcare profession must earn their trust. There
have been high-profile cases involving misuse of personal data,
such as the Facebook-Cambridge Analytica scandal. Similar events
within healthcare could have serious consequences.
Mental illness can affect capacity and thus the ability to provide
consent, and capacity can fluctuate over time. A patient may
initially give consent for passive monitoring, for example, but if
they later lose capacity due to worsening mental health then it is
unclear whether the patient’s initial consent remains valid. AI
would also require patients to consent to the collection of much
greater amounts of data; those who consent to the use of their
medical notes may not so easily consent to video, audio and other
forms of data being stored.
