Everywhere you turn these days, technology is changing the way we do everyday things. Who could have imagined 40, 30, or even 20 years ago how far it would advance – to the point that most people carry a device that can pull up any information they need, instantly?
The more time passes, the more advanced it becomes – especially when it comes to health. From fitness trackers and heart rate monitors to apps that can check your skin for signs of cancer, the technology just keeps getting more sophisticated.
And now, Apple is looking to hire an engineer with a background in psychology to help make Siri, the intelligent personal assistant of Apple products, a better psychologist.
The job ad, posted to Apple‘s jobs website, explains how more and more people are turning to Siri in times of crisis, having “serious conversations” with the artificial intelligence.
“People talk to Siri about all kinds of things, including when they’re having a stressful day or have something serious on their mind,” the ad explains.
“They turn to Siri in emergencies or when they want guidance on living a healthier life.”
In addition to a computer science degree and over five years of experience, the company is looking for an applicant with a peer counselling or psychology background to improve Siri.
Apple has been taking more of an interest in the health and wellbeing of its consumers of late, with its latest watch able to tell you if you’re having heart problems.
However, mental health is an area many devices with AI capabilities have neglected, according to a Stanford University study last year.
The study found that when asked simple questions about mental health, rape and domestic violence, phone assistants responded “inconsistently and incompletely”.
“Depression, suicide, rape and domestic violence are widespread but under-recognised public health issues,” said Eleni Linos, MD, DrPH, an assistant professor at UCSF and senior author of the paper.
“This is a huge problem, especially for women and vulnerable populations. Conversational agents could be a part of the solution. As ‘first responders,’ these agents could help by referring people to the right resources during times of need.”
One thing’s for sure: any advancement in the ways people can seek help for mental health issues is a good thing.