Telemental Health: Delivery Models and Performance Measurement
Important Evaluation Topics
Discussions about cost were usually related not to consumer treatment effectiveness but to organizational efficiencies, such as travel savings. Problems with access were discussed as a consequence of distance: in rural and frontier areas, geographic distances create problems of access to care. References to care were most often directed toward a "shortage" of psychiatrists in rural areas. When the possibility of other vital shortages was raised, psychologists, professional social workers, and nurses were mentioned. It may be worth noting that primary care physicians were not mentioned in this regard.
Other topics covered in the initial discussion of telemental health service evaluation included age-related differences. The group agreed that the needs of seniors for telemental health services are not really addressed in their projects, but that seniors are an important group they must serve. All the participants agreed that young people do well participating in telemental health services and that the medium seemed transparent to them; in other words, young people gave it little special regard or notice. At least two of the participants were working with populations where cultural appropriateness was a barrier to the effective use of the technology. One participant worked within a Native American community and expressed the opinion that any evaluation would need to be specially constructed to achieve cultural appropriateness in that setting.
A few of the ideas expressed in the opening discussion were directed toward improvements – such as "just in time" availability of the technology – by making connections to telemental health services more available in the rural environment, for example in schools. Finally, a few of the participants discussed the need for an evaluation of comparative diagnostic efficacy – what one participant termed "data that tells us that it works." A number of participants felt that one value of the technology that could be evaluated was whether the availability of telemental health services affected professional stability and retention rates.
None of the participants shared experiences related to evaluating
telemental health services. Instead, the next discussion focused
on an effort by the group to list important evaluation topics. The
key evaluation topics they chose were:
Access
Outcomes
Continuity of care
Diagnostic accuracy
Stability
Confidentiality
Timeliness
Locus of care
Compliance (e.g., consumer follows treatment plans)
Disposition
The topics were discussed in order of mention. After the group
felt that the list was complete, the facilitator invited them to
define and comment on the topics. They added very little more detail
than they had presented in their opening remarks. They did state
that in order to succeed in evaluating telemental health services,
they needed to collect more data than they had been collecting.
Focus group members reported that they had little or no experience
with the evaluation of telemental health services. Nine participants
reported being involved in reporting for OAT, but none of them were
familiar with mental health evaluation tactics or systems, or with
the evaluation concerns in the public system. One respondent commented
on the efforts of OAT to produce a common evaluation form for telehealth,
which, in their opinion, was "something that simply will not
work for telemental health.”
The original effort to develop an evaluation form for consumers
produced one that was “very complex, demanded a high reading
level - meaning most consumers could not fill it out reliably.”
Another respondent reported trying to implement a Minimum Data Set
approach, similar to that underway at the Center for Mental Health
Services. This respondent developed a form with questions for the
consumer, presenter, and consultant, each assessment being one side
of a page that takes a minute or so to fill out. In terms of content,
it works like a report card, with questions such as “did the
encounter fulfill intent, did it work, will the consumer get better.”
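As a rough illustration of how such a three-sided, report-card-style record might be captured, the following Python sketch models one assessment per role (consumer, presenter, consultant). The question wording, rating scale, and identifiers are hypothetical, based only on the respondent's description, not on the actual form.

    from dataclasses import dataclass, field
    from typing import Dict

    # Questions paraphrased from the respondent's "report card" description;
    # the 1-5 rating scale and identifiers below are hypothetical.
    REPORT_CARD_QUESTIONS = [
        "Did the encounter fulfill its intent?",
        "Did it work?",
        "Will the consumer get better?",
    ]

    @dataclass
    class EncounterAssessment:
        role: str                 # "consumer", "presenter", or "consultant"
        encounter_id: str
        answers: Dict[str, int] = field(default_factory=dict)  # question -> rating (1-5)

        def answer(self, question: str, rating: int) -> None:
            if question not in REPORT_CARD_QUESTIONS:
                raise ValueError(f"unknown question: {question}")
            if not 1 <= rating <= 5:
                raise ValueError("rating must be between 1 and 5")
            self.answers[question] = rating

    # Example: the consultant's one-page side of a single encounter record.
    consultant_view = EncounterAssessment(role="consultant", encounter_id="enc-0042")
    consultant_view.answer("Did the encounter fulfill its intent?", 4)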
Many participants argued for control group studies, but agreed that
most studies in all specialties have converged on the finding that
access to specialty services increases the likelihood of improved
outcomes. Participants presented anecdotal evidence that telemental
health works well. One told the story of the psychiatrist who said:
“I think people are freer, more natural than when they come
here, and therefore I am getting more and better information.”
The psychiatrist questioned whether it is really better for a patient to drive 150 miles to see him, struggle to find parking, risk being treated rudely, and probably endure a long wait.
A valuable set of performance indicators has been developed for
mental health and could be used by telemental health programs. The
performance indicators are currently being implemented in all states
for publicly funded mental health services. Supported by the Center
for Mental Health Services, the indicators are being developed through
the Mental Health Statistics Improvement Program (MHSIP).
The 1996 MHSIP Report Card covers domains of access, appropriateness,
prevention, and outcome from the point of view of the consumer.
These domains have provided the initial framework for indicators
developed by other organizations (the National Association of State
Mental Health Program Directors; the American College of Mental
Health Administration and the National Association of Psychiatric
Health Systems; and the Association of Behavioral Group Practices).
The federal Government Performance and Results Act of 1993 (GPRA) was an external factor influencing the development of the MHSIP Report Card. GPRA requires federal programs to have performance indicators in place by Fiscal Year 1999 (Manderscheid, 2000).
MHSIP is a community of people who have been working together since the early 1970s. A second version of the MHSIP Report Card is currently in draft form. An integrated set of mental health data standards is also being developed through Decision Support 2000+. Telemental health programs would do well to use these emerging data standards and to implement the MHSIP performance indicators already found in other mental health programs.
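One low-effort way for a telemental health program to start down this path is simply to organize whatever measures it already collects under the four Report Card domains. The Python sketch below illustrates that grouping; the domain names follow the 1996 MHSIP Report Card described above, while the class, its methods, and the example measures are hypothetical illustrations rather than anything defined by MHSIP or Decision Support 2000+.

    from collections import defaultdict

    # The four domains come from the 1996 MHSIP Report Card discussed above;
    # the example measures below are hypothetical, not Report Card items.
    MHSIP_DOMAINS = ("access", "appropriateness", "prevention", "outcome")

    class IndicatorSet:
        def __init__(self):
            self._indicators = defaultdict(list)

        def add(self, domain: str, name: str) -> None:
            if domain not in MHSIP_DOMAINS:
                raise ValueError(f"unknown MHSIP domain: {domain}")
            self._indicators[domain].append(name)

        def report(self) -> dict:
            # Group indicators by domain, in Report Card order.
            return {d: list(self._indicators[d]) for d in MHSIP_DOMAINS}

    # Example: a few program-defined telemental health measures (hypothetical).
    indicators = IndicatorSet()
    indicators.add("access", "Days from referral to first telemental health session")
    indicators.add("outcome", "Consumer-reported improvement at 90 days")
    print(indicators.report())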