JV: You mentioned in our email correspondence that the latest news regarding the FDA stems from last year, when they finally decided to approve apps on the devices due to concerns with the devices themselves (screen resolution, size, etc.). Can you elaborate on that and on what some of the concerns were?
JK: I'm no expert on this topic, but the biggest concerns I recall had to do with display characteristics (screen size, resolution, color depth, and so on) and with the security of patient data. Display characteristics are a concern because the FDA wanted to be sure that experts could make the same diagnosis on a small, lower-resolution screen as on a larger, higher-resolution one. There are medical-grade monitors, and an entire industry of hardware is being developed and regulated just for diagnostics. Security is also important; you don't want your name, physical attributes, patient ID, insurance number, and other personal information to be intercepted by, or accidentally sent to, someone who shouldn't see it.
In our case, we were classified as a Class II device. The FDA does not “approve” Class II devices; it allows them to be “cleared.” Clearance is not an endorsement, but it allows regulated devices to be marketed.
JV: I know it’s been a while, but can you walk us through what it’s like to go through the FDA review process and the agency’s requirements?
JK: As for the requirements, I urge anyone who is involved in this to study the source material themselves.
Again, I'm not an expert; we had an internal auditor who oversaw this. I plugged in from a testing perspective, so here are some of my experiences. (Note: this is not official or thorough; these are just aspects I remember.)
It depends on how important or risky they rate your app. There are three categories: major, moderate, or minor level of concern. Essentially, if a malfunction can kill or seriously injure someone, it is major; if it can harm someone, it is moderate; and if a malfunction won't cause harm, it is minor. The major level of concern is for devices that interact directly with the patient: they administer medications, monitor the patient, help with bodily functions, and so on. We fell into the moderate level of concern because our app could be used to interpret patient data. The overhead goes up significantly the higher up the chain you go.
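[Editor's note: The tiering JK describes can be sketched as a simple decision rule. This is an illustrative sketch of the interview's description only, not official FDA guidance; the function and parameter names are my own.]

```python
def level_of_concern(can_kill_or_seriously_injure: bool, can_harm: bool) -> str:
    """Map a worst-case malfunction outcome to the FDA level-of-concern tier,
    as described in the interview. Names are illustrative, not regulatory terms."""
    if can_kill_or_seriously_injure:
        # Devices with direct patient interaction: administer medication,
        # monitor the patient, assist bodily functions, etc.
        return "major"
    if can_harm:
        # e.g., apps used to interpret patient data (the case described here)
        return "moderate"
    # A malfunction causes no harm
    return "minor"
```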
Some of the review process has to do with your own software development process. Is it repeatable? Do you have mechanisms in place to manage requirements, code, and testing? Do you have a code repository? How about code, design, and risk reviews? There is a lot to manage with regard to process and roles: do you have someone who manages the requirements? How about the documentation for auditing purposes? They are fairly flexible on process, as long as you demonstrate some level of maturity in the oversight and control of what you do, and they are getting used to things like Agile and other lightweight methodologies.

The biggest thing people think about is documentation, and yes, you do need a good deal of that. You need a document control system, including storage and access control, and the format of the documents needs to be consistent, with proper titles, headers, and footers. Some of these requirements are very similar to those of an International Organization for Standardization (ISO) certification.
You have to do a solid risk assessment, where you essentially outline threats to patients and practitioners and list the ways you mitigate them, through process, testing, etc. For example, a device malfunction might lead to an inability to diagnose. You mitigate that through requirements work, user scenarios, code design, code reviews, unit testing, user scenario testing, failure recovery testing, and the like. If your risk assessment is light, you will fail your audit, so it needs to be honest and thorough.
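[Editor's note: The risk assessment JK outlines pairs each threat with explicit mitigations. Below is a minimal sketch of such a register, populated with the example from the interview; the field names and structure are my own illustration, not an FDA-mandated schema.]

```python
from dataclasses import dataclass, field


@dataclass
class RiskEntry:
    """One row of a hypothetical risk register: a threat, its impact on the
    patient or practitioner, and the mitigations that address it."""
    threat: str
    impact: str
    mitigations: list = field(default_factory=list)


risk_register = [
    RiskEntry(
        threat="Device malfunction",
        impact="Inability to diagnose",
        mitigations=[
            "requirements work",
            "user scenarios",
            "code design",
            "code reviews",
            "unit testing",
            "user scenario testing",
            "failure recovery testing",
        ],
    ),
]

# An audit-style sanity check: a "light" assessment (a threat with no
# mitigations) should fail before the auditors find it.
assert all(entry.mitigations for entry in risk_register)
```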