Name: Ayse Asli Ilhan

University: University of the Arts London

Institute: Creative Computing Institute

Program: MSc Data Science and AI

Assignment: Element 2

Unit Title: Computational Entrepreneurship

Introduction

Therapists in the fast-paced field of psychotherapy carry extremely demanding responsibilities and are often overburdened by the arduous process of taking detailed notes during and after their sessions. This challenge is very time-consuming and puts them at risk of missing crucial details that may significantly affect their patients' care. To address it, the team I was part of built Cogn.io, a tool that makes note-taking more manageable by leveraging state-of-the-art diarization, transcription, and NLP models. In this essay, I discuss the lessons learned over the course of our project, from brainstorming to the implementation of our technical and business strategy, and point out my involvement in the different phases.

Problem Identification and Solution Proposal

First, we brainstormed to pin down the critical issue: too much of a therapist's valuable time is wasted on manual note-taking, a task that is laborious and susceptible to human error and bias. The resulting reams of records also need structure and summarization to remain navigable and accessible. On top of that, therapists may lack the time, judgment, or support to analyze sessions in detail. We therefore developed Cogn.io, a tool that structures session information and generates summaries from therapy session notes, so that therapists can focus on their interactions with patients rather than on administrative work. We envisioned capabilities to record sessions, transcribe them to text, store them in a structured way, filter and retrieve recordings, and generate summaries along with the detection of potential biases. Our goals were not merely to make note-taking more time-efficient but, more importantly, to elevate the quality of therapy through thorough, organized, and objective records.

Market Research and User Analysis

Knowing the user helped us design a tool that would genuinely meet their needs, so we conducted in-depth market research and user analysis to define the specific pain points of therapists. Our research and interviews with therapists confirmed the need for a product that satisfies the following requirements: easy integration with existing EHR systems, speed, accuracy of transcription, and data security.

It was my job to envision the user journeys and create an intuitive UX that meets these demands. We storyboarded the user journey from recording sessions to accessing summarized notes and ensured that each step was straightforward and efficient. This involved working directly with potential users to gather feedback and improve the design through iterative processes. By focusing on the user experience, we intended to develop a tool that is not merely functional but also easy to use, and therefore better equipped to help therapists manage their workloads.

Session Transcript Clustering and Analysis

Establishing an efficient system for the clustering and textual analysis of session transcripts was a substantial technical challenge. At its core, the solution required robust speech-to-text conversion combined with diarization and speaker identification to distinguish between therapists and patients. For this purpose, we worked with advanced models chosen for their high accuracy and reliability.
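As an illustration of how such a pipeline can be assembled, the sketch below combines an ASR model with a diarization model to attribute each transcribed segment to a speaker. The specific libraries (openai-whisper and pyannote.audio), the model names, and the file path are assumptions made for the sake of the example, not a record of the exact stack we deployed.

import whisper
from pyannote.audio import Pipeline

AUDIO_PATH = "session.wav"  # hypothetical session recording

# 1. Transcribe the session; Whisper returns timestamped segments.
asr_model = whisper.load_model("base")
transcript = asr_model.transcribe(AUDIO_PATH)

# 2. Diarize the same audio to learn "who spoke when".
#    (The pretrained pipeline requires a Hugging Face access token.)
diarization_pipeline = Pipeline.from_pretrained("pyannote/speaker-diarization-3.1")
diarization = diarization_pipeline(AUDIO_PATH)

def speaker_at(time_s):
    """Return the diarization label of the speaker active at a given time."""
    for turn, _, speaker in diarization.itertracks(yield_label=True):
        if turn.start <= time_s <= turn.end:
            return speaker
    return "UNKNOWN"

# 3. Label each transcribed segment with its most likely speaker,
#    e.g. to separate therapist utterances from patient utterances.
for segment in transcript["segments"]:
    midpoint = (segment["start"] + segment["end"]) / 2
    print(f"[{speaker_at(midpoint)}] {segment['text'].strip()}")

In practice, the anonymous diarization labels (SPEAKER_00, SPEAKER_01, and so on) still have to be mapped to "therapist" and "patient", for instance by having the therapist confirm the first few utterances.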

The nature of psychological language itself imposes specific challenges. Such data is inherently difficult to standardize and structure, because the same symptoms or issues can be described differently by different therapists or patients. This calls for sophisticated approaches that interpret the gathered information accurately. For that reason, we put considerable effort into developing methods able to cope with the complexities of psychological language.

We applied Latent Dirichlet Allocation (LDA) to the transcriptions to retrieve topics from the document set. This allowed us to analyze a large volume of textual data, identify patterns and themes within it, and use the identified topics to cluster sessions, making exploration of session notes more organized and structured for therapists.
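A minimal sketch of this step is shown below, using scikit-learn's LatentDirichletAllocation; the toy transcripts, the number of topics, and the other parameter choices are purely illustrative and do not reflect our actual corpus or tuned settings.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Stand-in, de-identified session texts for illustration only.
transcripts = [
    "patient reports anxiety about work deadlines and trouble sleeping",
    "discussion of family conflict and coping strategies",
    "follow-up on sleep hygiene and relaxation exercises",
]

# Bag-of-words representation; stop words removed so topics stay meaningful.
vectorizer = CountVectorizer(stop_words="english")
doc_term_matrix = vectorizer.fit_transform(transcripts)

# Fit LDA with a small, illustrative number of topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(doc_term_matrix)  # per-document topic mixture

# Show the top words per topic, then assign each session to its dominant topic.
terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"Topic {idx}: {', '.join(top_terms)}")
print("Dominant topic per session:", doc_topics.argmax(axis=1))

Assigning each session to its dominant topic is what allows the interface to group related sessions together, so a therapist can browse, for example, all sessions revolving around sleep or family conflict.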

I was involved in developing and deploying these AI models, focusing in particular on the accuracy of the transcription and diarization and on the efficiency of the topic modeling algorithms. Running testing and refinement in a loop yielded the best results in terms of performance and reliability. For example:

Accuracy of Transcription and Diarization: