Stephen Schueller on the Power and Promise of Mental Health Apps

by Lawrence Rubin
Mental health apps hold great promise, but there are warning signs to heed.

Mental Health Apps 101

Lawrence Rubin: Thanks for joining me today, Stephen. I first became familiar with your work when I took a deeper dive into mental health apps and came across One Mind PsyberGuide, a system for evaluating these tools. For those of our readers who may not yet be familiar with them or have worked with them personally or professionally, can you define a mental health app?
Stephen Schueller:
A mental health app is essentially a software program that can support people in their mental health journeys. There are various kinds of mental health apps, with estimates suggesting that there are somewhere between 10,000 and 20,000 of them out there. Some of them are intended to be used on their own, so a consumer might use a product to self-manage facets of their own condition, like anxiety, depression, or trauma. And others are really meant to be used in conjunction with standard therapy.

So, for example, the Veterans Administration and the Department of Defense have developed a suite of different apps that are designed as adjuncts to standard evidence-based treatment: CPT Coach for cognitive processing therapy, PTSD Coach for PTSD treatment, and PE Coach for prolonged exposure. These are meant to be tools that help support a therapist and a client who are engaged in a specific type of treatment, like prolonged exposure or cognitive processing therapy.
LR:  Are the apps themselves subjected to the same type of empirical validation standards as the therapies they are adjunctive to?
SS: I think it is an appropriate question to ask. The level of evaluation that is needed depends on the type of product, the type of app. Those apps that are meant to be therapy adjuncts, for example, are designed to replace worksheets or other supplemental content that would go along with an established evidence-based treatment. Cognitive Processing Therapy Coach, developed by the VA and DOD, is meant to support cognitive processing therapy. Its various homework assignments, tracking components, and capacity to record the actual sessions so that clients can listen to them later and do some of the exposure exercises all get done in the context of the app. And so, to the same degree that you probably don’t need to evaluate every new version of a worksheet associated with an established treatment protocol, these apps don’t need to undergo the same types of rigorous evaluations as you would conduct for the treatment itself.

As opposed to apps that are therapeutic adjuncts, there are those that are meant to be treatments unto themselves. And if they’re not some type of formal treatment like the ones I mentioned, they might be self-help or self-management products, which opens some interesting questions. If these are replacing the self-help books of the past, do we need an evaluation of every single self-help book out there? Or is it sufficient that a self-help book aligns with evidence-based treatments and evidence-based principles, even if it does not have a formal evaluation?

And so, I think for these apps, it’s important to distinguish between direct and indirect evidence. Direct evidence would entail an evaluation of the app itself, exploring whether it has been subjected to clinical research studies that show effectiveness for the target condition or goal that the app is trying to change. Indirect evidence would be based on a pre-existing evidence-based practice, where we would be looking for fidelity of the app to that evidence-based practice.

In this latter case, the app would be evidence-informed rather than evidence-based. An app like that might be a digital CBT tool that has some fidelity to Cognitive Behavioral Therapy principles. And I would argue that there are various levels of evidence that we should be looking for with these apps. Obviously, I would love it if every app out there had a clinical trial showing its benefit, but I will tell you that’s not the case. Research suggests that only about 1 to 3 percent of mental health apps have any direct scientific evidence behind them. But an app that is evidence-informed is probably better than an app that is not based on evidence-based treatment at all. Again, it’s degrees of evidence, and that’s one of the things we explore at One Mind PsyberGuide: the various degrees of evidence supporting various products.
LR: So, what you’re saying is that just as there is a hierarchy of empirically backed treatment research, from randomized controlled trials down to anecdotal evidence, there are different levels of scientific evaluation that apps can be subjected to.
SS: That’s right. And I think I would add one other point, which is that in a lot of places we see that when treatments are adapted to new mediums, they often maintain their effectiveness. So, Cognitive Behavioral Therapy for depression has evidence that it works in person. It also works via teletherapy, in a group therapy format, as well as through self-help books. And so, to some degree, to continue to conduct the same level of studies as we move to new mediums may not be the most efficient use of our resources.

When we’re taking something to new mediums and apps, is this really a new treatment, or a new practice that’s being developed through this technology? Or is it taking something that’s worked before and packaging it in a new way? And so, I think that’s the thinking around the evaluation of indirect evidence. That an established intervention already works in various modalities and formats gives a lot of confidence that it would likely work in a digital delivery format, as long as it shows fidelity to the evidence-based principles that that treatment involves.
LR: We briefly mentioned self-help books. John Norcross, as an example, has done treatment outcome research at the highest empirical levels, but he has also written self-help books based on the same principles that drive his research. So that’s what you mean when you say if a therapeutic modality is robust and valid, we shouldn’t be that concerned with the transition into a different medium, such as digital technologies and apps.
SS: That’s right. Or at least we should be less concerned. The situations I worry most about are where new, innovative treatments are made possible using technology. I think those do need to meet really high standards of evidence to support their benefits.
LR: What would be an example of this?
SS:
I think there’s a lot of work to do around chatbot apps, where you would interact with the app as if you’re chatting with a person, or potentially a therapist. Although they’re often based on evidence-based principles, I have some questions about the benefit of chatting with a computer program.

And similarly, I’m also curious about some of these virtual care platforms using text message-based interactions with a therapist. Does that work? And what is the benefit someone gets from text-messaging back and forth with someone, even if they don’t have credentials? How do we distill evidence-based psychotherapy practices into these very brief back-and-forth interchanges?

So, I think there’s a lot of places where we do need new evidence to suggest that these things are beneficial. And I think that there is some promising evidence supporting both chatbots and text message-based interactions as potentially being clinically efficacious. But I do think these are places where we need more research to support these practices.   
LR: Are these chatbot apps like virtual assistants, driven by artificial intelligence programs designed to provide human-type responses?
SS: There definitely are products like that. Three examples would be Woebot, Youper, and Wysa. All of these are apps where a user who downloads the app can message back and forth with a virtual agent that provides full-text answers in return. Again, they’re often based on therapeutic principles. But these are the types of things that were not possible just a brief time ago. This is not like taking a self-help book and digitizing it. This is a very new type of thing that is possible because we have computer programs and software that can do these types of interactions.
LR: Would these types of virtual assistants be programmed with keywords that might be sent off to a therapist if the person is simultaneously working with a “live” therapist, or are they completely asynchronous standalone surrogates for therapy? 
SS: It’s a little of both. You couldn’t take this program and bring it to your therapist and say, “Okay, I’m going to use this on the side, and it’s going to reach out to you if these certain words come up.” Some of the programs are designed to communicate directly with a therapist. Or they are a gateway. One way to think about these is as a low-intensity first step that can then introduce or connect someone to a therapist if necessary. And some of these programs do have that model, where if there is need for a therapist, they can step up to that higher level of care. But these aren’t the types of things where you as a client would say, “Okay, I’m going to use this in conjunction with a therapist I’m seeing.”
LR: I know that there are apps for medical care. For instance, those that monitor cardiovascular activity and then send that data to a physician or a physician’s assistant. Are there ways for some of these apps to communicate directly with a therapist, who then would respond to the client? 
SS:
There definitely are some apps that try to digitize measurement-based care, to allow some communication or transmission of data based on symptom tracking, logging, or other things that people would be doing as part of the treatment they’re receiving, and to feed that information back to their therapist.

The Wild Frontier

LR: In the “old days,” people crowded the self-help aisles at Barnes & Noble or other bookstores. Today, in contrast, e-consumers routinely scroll through platforms like Amazon. How do folks who may not be ready or interested in taking the step into therapy find their way through this labyrinth of 10,000 to 20,000 apps? Is there some sort of roadmap, or a central directory?
SS:
I think it’s hard. And I’ll say that there’s no one centralized hub. But I think most consumers go to the app stores and put in keywords like depression, anxiety, or stress, or whatever they’re struggling with. And I think that the app stores do a very poor job differentiating these products, because most of the search results bring up apps that have four-and-a-half to five stars. That doesn’t really provide a lot of information about the differences between these apps, or which are the evidence-based ones. Relatedly, a lot of people hope or think that the FDA is going to solve this problem. I will say that the FDA has cleared some mental and behavioral health apps, starting with reSET back in 2017, which was an app focused on substance use disorders. But since then, only about a handful of mental health apps, about 10, have been cleared by the FDA. That’s 10 out of 10,000 to 20,000 over a period of about five years, which is about two products per year being evaluated and cleared.

There is a class of products about which the FDA has said that “they are exercising enforcement discretion,” which means, “We probably could regulate these, but given our assessment of the risk-benefit ratio, we’ve decided not to.” Examples of apps in that category are those that allow consumers with diagnosed mental health conditions to self-manage their own symptoms, such as by providing a tool of the day or different behavioral coping skills. A lot of people think that FDA regulation shows that something is efficacious or effective, but in actuality the FDA is mostly concerned about safety. They’re looking at the risk profile of these products and then clearing them based on that. This is all to say that the FDA is not really doing much, or has not done much, in this space. At the beginning of the pandemic, they paused their review of products in this space, given the potential need for digital services to help support mental health problems in the pandemic. So, this is a space that’s been traditionally messy and has gotten even more so over the past couple of years.

I think a couple of places that I would point to as being better able to provide more information for consumers are the Veterans Administration and the Department of Defense. While they are mostly focused on veterans, their apps and evaluation procedures are also useful to diverse consumers, especially for therapists who are providing some of these evidence-based practices. And there is my project, One Mind PsyberGuide, which really tries to collect and provide some of this information for consumers to help them make informed decisions.
LR: So, with the exception of the small handful of apps the FDA and the VA and DOD have approved, publishers of mental health apps do not have to post any black box warnings.
SS: That’s exactly right. There’s little regulation of this space outside of the area that the FDA decided that they’re going to regulate, which, as you mentioned, is quite small.
LR: What are some of the criteria that a consumer should be looking at when they go to the app store?
SS: I think there are three main buckets of elements that are important to consider when searching for a mental health app: credibility or evidence base, user experience, and safety, especially related to privacy and data security.

Credibility or evidence base goes back to the conversation we were having earlier around the evaluation of the evidence behind these products. Is there either direct (evidence-based) or indirect (evidence-informed) support for the app’s effectiveness?

User experience, which is subjective, is about whether the app is easy to use, easy to learn, aesthetically pleasing, free of technical glitches, engaging, and something you would come back to. Based upon this criterion, users can narrow down a set of apps to a selection of three or four and then try each of them out to see which works best for their needs.

Lastly, safety and security issues are related to data security and privacy. What is their privacy policy? What do they do with your data? Who is it accessible to? A few years back, we did a review of security policies on 120 depression apps and found that about half didn’t have any policy whatsoever, so they told you nothing about what they did with your data, which was a major red flag to us. And of the half that did have data security and privacy policies, half of those were deemed unacceptable using the scale we developed at One Mind PsyberGuide. These apps didn’t provide their data security and privacy policies until after you had already put in information about yourself. So, for example, you would create a user profile by entering your personal information, and only after that would the app tell you, “Okay, now we’ll tell you what we do with your data.” That would be a pretty easy red flag for a consumer.
LR: In this Wild West of the internet, what entities might data be shared with?
SS: Often, it’s back to some of the big tech companies—the Googles and the Facebooks, where one’s data might be used for advertising or other marketing purposes. That would make me a little uncomfortable with mental health apps, although, honestly, I do use products that are associated with those worlds. With some of these apps, consumers just won’t know.

I talk a lot about the importance of transactional value for data in this space. So, what do I get back, and does that align with what I’m using the data for? With Google Maps, for example, I’m sharing my location information, but in return, it’s helping me navigate to somewhere based on my location. That’s the transactional value, but it feels a little bit different when it comes to mental health apps. Why do they need to know my location?  
LR: And since the FDA has only regulated a very small percentage of the apps, I imagine the potential for consumer deception is very great.
SS:
That’s right. I think another thing is that sometimes there is a misconception where some people assume that if there’s data present, these apps must be regulated under HIPAA. But it’s important to realize that HIPAA relates to data that’s coming from covered entities, which in our case would be traditional health care providers. If an app is sharing information with a health care provider like your therapist, it should be, and hopefully is, following HIPAA regulations. But if there’s no covered entity involved, then a lot of these apps are not covered by HIPAA, and they can change their terms of service or privacy policies without having to get approval from you. I’m much more comfortable with apps that are not collecting or sharing data, like a lot of the VA and DOD ones that don’t collect or share your information.
LR: I would also imagine that if a therapist assigns or recommends a particular app to a client, there’s the issue of potential vicarious liability. It would therefore behoove the clinician to become aware of all these different elements of the apps, particularly their privacy policies.
SS: That’s exactly right.
LR: Have you found that there are particular mental health conditions or client types that are more amenable to the use of mental health apps?
SS:
There’s a lot of evidence to support the use of these tools for depression and anxiety. That doesn’t necessarily mean that these conditions are more amenable to apps. It’s more a reflection of where the research started and what information has accumulated. What I often say is that everything that has been treated with a psychosocial intervention has a digital tool or app that might be useful.
LR: And relatedly, some of the most effective treatments for anxiety and depression are cognitive behavioral. Have you also found some useful trans-theoretical mental health apps or those that capitalize on other types of interventions like Gestalt, or Psychoanalytic, or Existential?
SS: A lot of the apps out there are based on Cognitive Behavioral Therapy principles, but I do think there are some that could be amenable to some of the other treatments like you mentioned. Especially if we think about some of the general aspects of some of these apps. For example, you might be interested in tracking your mood or your symptoms, or different goals or values you have over time. You could imagine an app like that could be useful in a variety of different treatments.

It has more to do with the theoretically aligned goals that you’re trying to achieve in those treatments and what products might support those goals. But you’re right in suggesting that a lot of the tools out there are CBT-based. We recently did a study in which we reviewed apps with different features of thought records for Cognitive Behavioral Therapy. Traditionally, a therapist using CBT would give their client paper thought records to keep between sessions.

Since there are now all these digital tools promising or promoting that they can do this, we went back to see how faithful they were to traditional paper-and-pencil thought records. What we found is that although the set of apps we reviewed all had some elements of thought records, very few had all the elements. So, I think this is an important call, if you’re a therapist or a consumer, to look under the hood of the app and see what’s present in it. Pilot it, so you know what’s there. Just because it says it’s a cognitive behavioral therapy app doesn’t mean it has all the elements that you would want to be using, either as a provider or as a consumer.
LR: Have you found there to be an “optimal consumer” profile for users of mental health apps, defined by a certain set of characteristics?
SS:
I think we see that people who are young, tech-savvy, and motivated tend to do better with these apps, especially on their own. In my own experience, older clients or those with less digital literacy might be a little bit more challenging to onboard. But if you can train them and work with them, essentially providing a little bit of digital literacy training, these particular clients often become the most excited and engaged users of these tools. And for some of these clients, basic digital literacy training or support can be useful in other areas of their lives. I often tell clinicians to do some sort of assessment of their clients’ digital literacy skills, their interests, and their previous experiences using apps, and health apps specifically. That information can help clinicians guide clients to the most appropriate and useful digital tool.

If a client is interested, willing to learn, and excited to do so, they might be a good fit for a mental health app. I don’t think these tools are for everyone, and I would never force them on anyone, nor should a clinician. These should simply be a tool in the toolbox; they’re not the only thing we have available. But don’t assume that if someone doesn’t fit the perfect profile, there might not be other ways to support them in using these tools. They might eventually end up being a great fit, and a great client, for these tools.

Challenges

LR: So, young, motivated, tech-savvy—got it! What about marginalized clients? Those that have been and/or continue to be disenfranchised, whether due to SES, education, race, culture, age?
SS: Yeah, well, I’ll say this is a place where I think the field has really failed so far. There’s a lot of promise, and a lot of dialogue like, “Oh, we’ll build these technologies, and we’ll reach people who haven’t been reached otherwise. And we’ll expand access.” The reality of the situation currently is that a lot of these products are made for White majority individuals, in terms of the language (English), the imagery, and the style of the dialogue that’s present.

I think that’s shifting a little bit. There definitely are developers and entrepreneurs who are creating products that are tailored for traditionally marginalized and underserved groups. And I think that’s important. It’s something we’ve seen in both research studies and in our experience talking to consumers: products that are tailored to specific populations are more effective and engaging, and those consumers see them as more appealing. But I think the reality of the situation is that if you try to find a Spanish-language app or one tailored to another underserved group, there are far fewer out there. So, I think this is an unfulfilled promise in this space right now, and more work needs to be done.
LR: Sort of the digital equivalent of the finding that specialized populations need specialized services by professionals who are most familiar with their needs? 
SS: I think that’s exactly right, despite there being a lot of rhetoric like, “Oh, we’ll have these products, and that gets around this problem, because we don’t have to rely on the provider. We’ve got technologies.” But you still have to design it. It’s not just the technology; the apps must be able to meet the needs of these distinct groups. It’s not going to be one-size-fits-all, and we can’t create a product without consideration of racial, ethnic, and cultural diversity.
LR: And availability is a self-limiting issue, because not everybody has an iPhone. Not everybody who has an iPhone knows what to do with it. And not everybody has a computer; if they do, it may just be for simple functions. I don’t know if I’m overstating it when I suggest that mental health apps and digital technology like this really favor the educated, the employed, the informed, the digitally familiar.
SS: I don’t think it’s overstated. Even if we look at research studies, the most common participants are middle-aged White women. So, I think that’s the group we know the most about in terms of who these tools work for.
LR: What role do you see mental health apps playing in working with suicidal clients or those in crisis?
SS: I think there are a couple of places where these tools can be useful. One is having these apps be collections of crisis resources. I know, for example, that PTSD Coach had a safety planning tool and crisis support services directly in the app. And it was such a popular feature that they developed a standalone version containing provider resources. So, some of it is putting resources in people’s pockets at the places and times they need them most, where they can save lives. I’ve been part of a team that has done a little bit of work in using these tools while a person is undergoing acute treatment. We were working with people on an inpatient unit who were learning Dialectical Behavior Therapy skills and who used this app, or got the app after leaving the setting, as a reminder to use the tools.

We often talk about these tools as being on-ramps and off-ramps to mental health care. On-ramps introduce people to what this whole therapy thing is about and what some of the things are that they will be learning in therapy. So, not replacing treatment, but getting someone ready so that they might be more willing to go, having already started learning some of those skills. And off-ramps are the booster sessions, or the reinforcement of the skills. I think the same thing applies to individuals who are dealing with suicidal ideation or who have been through a suicide attempt, in that these tools might be ways to reinforce some of the skills and support some of the things that they learned.
LR: So, mental health apps can have a wide range of usages for suicidal clients and other clients in crisis, but not as standalone resources.
SS: I think that’s exactly right. And a great point, and something I should really emphasize and just say directly: I don’t think that these apps are replacements for therapists. But I also don’t think this is an either/or. This is a yes/and. I think that these tools can be useful in the toolboxes of therapists, as well as in toolboxes to provide mental health services broadly. And we must think about ways in which technologies can really augment and support therapists, giving them skills or resources to do things that they weren’t able to do before. But in all, I think that putting resources in the hands of clients at the times they need them is one of the biggest potentials of these tools.
LR: There’s a wide body of research that examines the impact of therapeutic relational variables on treatment outcome. When it comes to apps, that relational connection is absent. How might mental health apps, especially those that are asynchronous or not connected to a therapist, take the place of relationship? Or is it, again, not an either/or, but a yes/and?
SS:
Yeah, I think it is a yes/and. We’ve done a little bit of research, as have others, looking at relational variables or therapeutic alliance with these products specifically. And we find that people do form relationships to products—in this case, apps. I think that people have attachments to their phones. It’s something I demonstrate often during in-person talks. I might say, “Everyone, hold up your phone,” and everyone whips their phone out of their pockets and shows, hey, everyone has one of these. And then I’ll say, “Okay, now pass it to the person on your left.” And everyone looks at me like, “Why would I do that? I’m not giving up my phone. I’m not letting someone else touch it.” We can form attachments or feelings… I mean, not the same as we would to a therapist, but there are relational aspects that occur. I think sometimes with these apps, it’s about the authority or the sense of who developed this, and whether we trust them. There are various aspects that come up. So, I think that’s one aspect.

I think another aspect, and this applies more to the products that do have some sort of human support or human component, is that having these smaller interactions can actually create a sense of connection or relationship. There was a study that a colleague of mine did where they had someone reach out to people, which they referred to as mobile hovering. It was a daily text message from a person—not a therapist, not their therapist, but just someone who checked in—that started with three questions: Did you take your medication today? Have you had any side effects? And how are things going for you? Those were the three messages they got every day, and they got a response back. They had their therapist and their psychiatrist as well. And at the end of the study, when they asked about relational variables, people felt most connected to the person sending them those three text messages every day, because they felt like that person was really invested in them and was checking up on them. We’ve also done some work with automated text messaging — just pushing notifications to people every day. And clients will respond to them. They’ll say, “Thank you.” We’ll tell them, “Hey, no one’s monitoring this. This is automatic.” And they’ll say, “Yeah, I just felt like I had to respond.” So, I do think it’s not the same. But there are relational things that come up, even with automated programs.
LR: What about mental health apps for children and teens?
SS: Some research suggests that a lot of teens have used these types of tools. There was a nationally representative survey of folks 14 to 22, and about two-thirds had used a health app. A lot of those were focused on mental health conditions, stress, anxiety, or substance use, or were apps that used interventions related to mental health, like mindfulness. Interestingly, if you looked at those with elevated levels of depression, those who met clinical cutoffs on standard measures, three-fourths of those teens had used a health app.

So, we find that they’re using these types of tools. One thing that is disappointing to me is that there aren’t a lot of apps that are really tailored for teens. And this goes back to some of the conversation we had earlier around traditionally underserved or marginalized populations. The same thing occurs for teens: a lot of the products that have been developed were developed for adults, and we typically “youthify” them by adding different images without really designing them with teens in mind.

So, I think it’s a place where there’s a lot of promise and a lot of potential. You mentioned some of these qualities earlier: teens are on their phones often, they’re digital natives, and they’re comfortable using technology. But we need to develop more products that are specifically designed for teens, with teens, in ways that make them better fits for that population.

Evaluation

LR: Circling back to the early part of this discussion when we addressed the evaluation of mental health apps, can you describe what One Mind PsyberGuide does? 
SS: I describe One Mind PsyberGuide as something like a Consumer Reports or Wirecutter of digital mental health products. We identify, evaluate, and disseminate information about these products to help consumers make informed decisions, and we operate a website that posts all the reviews that we’ve done. We evaluate products on three dimensions related to the categories I mentioned earlier: credibility, user experience, and transparency around data security and privacy. We say “transparency,” not “data security and privacy,” because we don’t do a technical audit of the app; we review their privacy policies. So, for example, if an app says that their data is safe and encrypted, we don’t try to hack into their system so we can say, “Is it really encrypted?” We say, “Okay, we’ll take that at face value.” Our guide is designed to be mostly consumer-focused, geared toward people looking to use those products themselves. But we also know that a lot of clinicians turn to it to better understand what the evidence base is behind these tools.

We also provide professional reviews for some of the products, by which I mean we have a professional in the field use the product, review it, and write up a short narrative review about the pros and cons and how you might use this tool in your practice or your life. That’s like a user guide or a user manual for these tools, because a lot of these apps don’t come with instructions like, “Well, this is how you might be able to use it to benefit clients or yourselves.” So, we provide some of that information. And that’s one of the more popular sections of our website — those professional reviews of specific products.
LR: Like what the Buros Mental Measurements Yearbook provides for psychological instruments.
SS: That’s right.
LR: I know the APA, the American Psychiatric Association, has its App Advisor. Is that similar or equivalent to One Mind PsyberGuide’s system?
SS:
Yeah, I think it’s similar. The difference between the App Advisor at APA and what we do at One Mind PsyberGuide is that the App Advisor is a framework that lays out the different areas you should be considering when you are evaluating an app, whereas at One Mind PsyberGuide, we’re doing some of the evaluation and providing scores. The two systems can be quite complementary. What I often recommend for clinicians and providers is that you might use One Mind PsyberGuide as a narrowing tool, to go from those 10,000 to 20,000 apps to a smaller subset that might be reasonable for you to look at. And then you could use the APA’s framework to pilot and evaluate them yourselves.

As I mentioned, or as we’ve talked about, there are a lot of ways these are like self-help books. And I wouldn’t recommend a clinician give out a self-help book they hadn’t read or at least looked at. So, I think the American Psychiatric Association’s framework is a good way to think about evaluating and looking at these apps, to identify the different features that you should be considering in your own review and evaluation.
LR: As we close, Stephen, I recall your saying that you were working on and had just submitted a grant to SAMHSA. Are you at liberty to share what the grant was about?
SS: It’s loosely related to mental health apps, although it will be more exciting if we get the grant. SAMHSA is starting a Center of Excellence on social media and mental well-being. So, effectively, it is developing a clearinghouse to help summarize the research and the evidence-based practices that might help protect children and youth who are using social media and support them in being empowered and resilient in using those tools effectively. And it will provide technical assistance to youth, parents, caregivers, and mental health professionals around what they might be able to do regarding children, youth, and social media.

I think that it will be a great resource for better understanding what risks social media poses, and how we might better help kids navigate that space. Because I do think it’s an interesting challenge that was not present in my youth, in terms of the dangers, but also the opportunities, that social media presents.
LR: What are you most excited about now in this whole area of mental health apps? What really gets your blood flowing?
SS:
One thing I’m really interested in is how we can better use these tools to empower people who are not professionals to support others in evidence-based ways, or to equip them with extra skills that they don’t have. So, something that I’m really interested in, as we’ve seen a lot of peer certification programs develop across the country, is how we might better empower peers to use mental health apps or digital products in their support of other people, to bring evidence-based practices into the work that they’re doing.

So, how do we really scale with technology? Because among the current technologies we have, the most effective ones are those that include some form of human support. Although there’s a promise of scalability in technology, it hasn’t yet been realized. That’s one aspect that I think is really exciting.

And another aspect, which touches on something we’ve talked about a couple of times, is how we develop better products for different populations: for ethnic and racial minorities, for youth, for LGBTQ individuals. And I think that there are a lot of really exciting groups supporting that, like The Upswing Fund, Headstream, and different funding and innovation platforms that are really trying to empower people from these groups to develop and evaluate products and show their benefit. Hopefully in a couple of years, I won’t have to say this is an unmet promise of this field.
LR: In a related vein, is venture capital something that might boost mental health apps to the next level? Or is it something that might undermine the quality of mental health apps?
SS: That’s a great question. Venture capital funding in this space has grown exponentially over the past decade. So, I am excited to see people excited, and excited to see people investing money in this space. But ultimately, it remains to be seen whether this will lead to more effective resources for those in need.
LR: Stephen, I appreciate your time. But even more, your incredible breadth of knowledge and passion in this burgeoning field. I’m going to close by thanking you.
SS: I appreciate your interest in the area.


Copyright Psychotherapy.net 2022

Stephen Schueller, Ph.D., is an Associate Professor of Psychological Science and Informatics at the University of California, Irvine. He received his BA in Psychology from the University of California, Riverside, and his PhD in clinical psychology from the University of Pennsylvania, and completed his clinical internship and postdoctoral fellowship at the University of California, San Francisco. At UCI he leads the Technology and Mental Health (TEAM) Lab, is a faculty member in the Connected Learning Lab and the Jacobs CERES Center, and leads the implementation evaluation core for the Help@Hand Evaluation Team. He is also the Executive Director of One Mind PsyberGuide, a project that aims to empower consumers to make informed choices around digital mental health products. As a clinical psychologist and mental health services researcher, his work broadly looks at creating more scalable mental health resources that make treatment more available and accessible, especially with technology. In his work he has developed, evaluated, and disseminated various web- and mobile-based interventions for the treatment and prevention of mental health issues, especially common issues such as depression and anxiety. His research has been funded by the National Institute of Mental Health, the Jacobs Foundation, One Mind, Pivotal Ventures, and the Robert Wood Johnson Foundation.

Stephen Schueller was compensated for his/her/their contribution. None of his/her/their books or additional offerings are required for any of the Psychotherapy.net content. Should such materials be referenced, it is as an additional resource.

Psychotherapy.net defines ineligible companies as those whose primary business is producing, marketing, selling, re-selling, or distributing healthcare products used by or on patients. There is no minimum financial threshold; individuals must disclose all financial relationships, regardless of the amount, with ineligible companies. We ask that all contributors disclose any and all financial relationships they have with any ineligible companies whether the individual views them as relevant to the education or not.

Additionally, there is no commercial support for this activity. None of the planners or any employee at Psychotherapy.net who has worked on this educational activity has relevant financial relationship(s) to disclose with ineligible companies.
Lawrence ‘Larry’ Rubin, PhD, ABPP, is a Florida licensed psychologist and registered play therapist. He currently teaches in the doctoral program in Psychology at Nova Southeastern University and is a retired Professor of Counselor Education at St. Thomas University. A board-certified diplomate in clinical child and adolescent psychology, he has published numerous book chapters and edited volumes on psychotherapy and popular culture, including the Handbook of Medical Play Therapy and Child Life: Interventions in Clinical and Medical Settings and Diagnosis and Treatment Planning Skills: A Popular Culture Casebook Approach. Larry is the editor at Psychotherapy.net.

Lawrence Rubin was compensated for his/her/their contribution. None of his/her/their books or additional offerings are required for any of the Psychotherapy.net content. Should such materials be referenced, it is as an additional resource.


CE credits: 1

Learning Objectives:

  • explain the clinical utility of mental health apps
  • list the critical features of mental health apps
  • describe One Mind PsyberGuide’s method of evaluation

Articles are not approved by Association of Social Work Boards (ASWB) for CE. See complete list of CE approvals here

This Disclosure Statement has been designed to meet accreditation standards; Psychotherapy.net does its best to mitigate potential conflicts of interest and eliminate bias in all areas of content. Experts are compensated for their contributions to our training videos; while some of them have published works, the purchase of additional materials is not required for any Psychotherapy.net training. Each expert’s specific disclosures can be found in their biography.

Psychotherapy.net offers trainings for cost but has no financial or other relationships to disclose.