10 ways to detect people who are a threat to your clinical trial

Flaskdata.io helps Life Science CxO teams outcompete using continuous data feeds from patients, devices and investigators mixed with a slice of patient compliance automation.

One of the great things about working with Israeli medical device vendors is the level of innovation, drive and abundance of smart people.

It’s why we get up in the morning.

There are hundreds of connected medical devices and digital therapeutics (last time I checked over 300 digital therapeutics alone).

When you have an innovative device with network connectivity, security and patient privacy, availability of your product and integrity of the data you collect has got to be a priority.

Surprisingly, we get a range of responses from people when we talk about the importance of cyber security and privacy for clinical research.

Most get it but some don’t.  The people who don’t get it seem to assume that security and privacy of patient data are someone else’s problem in clinical trials.

People who don’t work in security assume that the field is very technical, yet really – it’s all about people.  Data security breaches happen because people are greedy or careless.  100% of software vulnerabilities are bugs, and most of those are design bugs that could have been avoided or mitigated by 2 or 3 people talking through the issues during the development process.

I’ve been talking to several of my colleagues for years about writing a book on “Security anti-design patterns” – and the time has come to start. So here we go:

Security anti-design pattern #1 – The lazy employee

Lazy employees are often misdiagnosed by security and compliance consultants as being stupid.

Before you flip the bozo bit on a site coordinator as being non-technical, consider that education and technical aptitude are not reliable indicators of dangerous employees who are a threat to the clinical trial assets.

Lazy employees may be quite smart, but they’d rather rely on organizational constructs than actually think, execute and occasionally get caught making a mistake.

I realized this while engaging with a client who has a very smart VP – he’s so smart he has succeeded in maintaining a perfect record of never actually executing anything of significant worth at his company.

As a matter of fact – the issue is not smarts but believing that organizational constructs are security countermeasures in disguise.

So – how do you detect the people (even the smart ones) who are threats to PHI, intellectual property and system availability of your EDC?

1 – Their hair is better organized than their thinking

2 – They walk around the office with a coffee cup in their hand and when they don’t, their office door is closed.

3 – They never talk to peers who challenge their thinking.  Instead, they send emails with a NATO distribution list to everyone on the clinical trial operations team.

4 – They are strong on turf ownership.  A good sign of turf-ownership issues is when subordinates in the company have gotten into the habit of not challenging the coffee-cup-holding VP’s thinking.

5 – They are big thinkers.    They use a lot of buzz words.

6 – When an engineer challenges their GCP/regulatory/procedural/organizational constructs – the automatic answer is an angry retort “That’s not your problem”.

7 – They use a lot of buzz-words like “I need a generic data structure for my device log”.

8 – When you remind them that they already have a generic data structure for their device log, and a wealth of amazing free tools like Elasticsearch and R for mining those logs (a sketch follows this list), they go back and whine a bit more about generic data structures for device logs.

9 – They seriously think that ISO 13485 is a security countermeasure.

10 – They’d rather schedule a corrective action session 3 weeks after the serious security event instead of fixing the issue the next day and documenting the root causes and changes.
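
Since item 8 mentions Elasticsearch, here is what mining a device log actually looks like in practice. This is a minimal sketch, not production code: the index name, field names and the elasticsearch-py 8.x Python client are my assumptions, not anything specific to your device.

```python
# Minimal sketch (assumed index/field names, elasticsearch-py 8.x client):
# store device log events as plain JSON documents and pull back a simple
# "errors per device over the last week" summary.
from datetime import datetime, timezone
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

def log_event(device_id: str, event: str, level: str = "INFO") -> None:
    """Index one device log record."""
    es.index(
        index="device-logs",
        document={
            "device_id": device_id,
            "event": event,
            "level": level,
            "ts": datetime.now(timezone.utc).isoformat(),
        },
    )

def errors_per_device(days: int = 7) -> dict:
    """Count ERROR-level events per device over the last N days."""
    resp = es.search(
        index="device-logs",
        size=0,
        query={
            "bool": {
                "filter": [
                    {"term": {"level": "ERROR"}},
                    {"range": {"ts": {"gte": f"now-{days}d"}}},
                ]
            }
        },
        aggs={"by_device": {"terms": {"field": "device_id.keyword"}}},
    )
    return {
        bucket["key"]: bucket["doc_count"]
        for bucket in resp["aggregations"]["by_device"]["buckets"]
    }
```

That is the whole point of a generic data structure for a device log: you stop arguing about schemas and start asking questions of the data.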

If this post pisses you off (or if you like it), contact me – always interested in challenging projects with challenged people who challenge my thinking.

Develop project management competencies to speed up your clinical trials

The biggest barrier to shortening clinical trial data cycle times is not recruitment.   It is not having a fancy UI for self-service eCRF forms design.   It is not software.

It is not, to paraphrase Medidata, having the ability to Run Your Entire Study On A Unified, Intelligent Platform Built On Life Science’s Largest Database.

It is incompetence in managing a construction project.

That construction project is called designing a clinical trial and the information system for collecting and monitoring data.

For a long time, I thought that this was peculiarly an Israeli problem.

However, conversations with colleagues in the US and Europe suggest that late starts, foot-dragging and time-consuming change requests may be the norm. Collecting too many variables in the data model is the norm. Complex, long forms that make life hard for the site coordinators are the norm. Surfeits of edit checks and thousands of queries are the norm.

Most companies spend little  money on project management training and even less money on clinical project strategy development.  Most training is on process, regulatory compliance and standard operating procedures.

Rarely do we see medical device companies spend money on competencies that will help employees construct clinical trial projects more effectively.

There are verbal commitments, but rarely action commitments.

Yet there is a direct linkage between clinical operations team knowledge and corporate revenue growth which is dependent upon delivering innovative drugs and devices to market.

One way management teams can maximise their investments in project training and clinical project strategy development (outsourced or in-sourced) is to link clinical operations team training to study management competency models that management can qualify and measure.

But the development of a clinical team competency model has strategic and operational barriers that must be managed to make it successful.

Clinical trial project management competency model example

Clinical team Competency Setup Considerations

1. Clinical people often think that building the ‘database’ is an art, not a science, and don’t like to be measured in what they perceive is a non-core skill.

2.  Your project  competency model must include both soft and hard skills training to make it effective.

3. Clinical trial management teams must focus on the competency requirements to make it work and it must be a hands-on approach.

4. You must be able to quantitatively measure the competencies (time to design forms, edit check design, monitoring signals, data cycle time, time spent in meetings, change requests); see the sketch of one such metric after this list.

5. Competency clinical trial management training programs must be continuous training and educational events, not a one-time event or else the program will fail.

6. The steps of your competency program must be very specific and delineated to make sure it can be delivered and measured.

7. Your clinical operations team must agree that the competencies you are measuring truly help them deliver the study faster (they don’t have to like doing it, just agree that these are required action steps to reduce data cycle times).

8. When implementing your project competencies audits, the certification should be both written and experientially measured to get an accurate reading of the clinical operations team member capabilities.

9. All project  competency certification candidates should have the ability to retest to confirm skills growth.

10. Project competency assessments should never be used solely as a management scorecard tool to make employment decisions about clinical operations team members.
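
A sketch of what item 4 looks like in practice – measuring data cycle time as the days between a patient visit and the last query on that visit’s forms being closed. The record layout and the numbers are hypothetical; the point is that the metric is simple to compute once you decide to track it.

```python
# Minimal sketch (hypothetical record layout): median data cycle time per site,
# measured as days from patient visit to the date the visit's forms were clean.
from dataclasses import dataclass
from datetime import date
from statistics import median

@dataclass
class VisitRecord:
    site: str
    visit_date: date
    last_query_closed: date  # date the visit's forms were finally clean

def cycle_times_by_site(records: list[VisitRecord]) -> dict[str, float]:
    """Median data cycle time (in days) per site."""
    by_site: dict[str, list[int]] = {}
    for r in records:
        by_site.setdefault(r.site, []).append(
            (r.last_query_closed - r.visit_date).days
        )
    return {site: median(days) for site, days in by_site.items()}

# Illustrative data: one slow site, one fast site.
records = [
    VisitRecord("IL-01", date(2019, 5, 2), date(2019, 5, 9)),
    VisitRecord("IL-01", date(2019, 5, 6), date(2019, 5, 20)),
    VisitRecord("DE-02", date(2019, 5, 3), date(2019, 5, 5)),
]
print(cycle_times_by_site(records))  # {'IL-01': 10.5, 'DE-02': 2}
```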

To increase your company revenues and clinical project training success, build and deliver project competency models.

Why Microsoft is evil for medical devices

Another hot day in paradise. Sunny and 34C.

Not a disaster but still a PITA

We just spent 2 days bug-fixing and regression-testing code that was broken by Microsoft’s June security update to Windows and Internet Explorer 11.  Most of the customers of the FlaskData EDC, ePRO, eSource and automated detection and response platform use Chrome or Firefox on their desktops.  That was no solace to the site coordinators at one of the sites using Flaskdata: they came into work on Monday and the hospital-standard Internet Explorer 11 no longer supported our application.

Microsoft published KB4503259 as a cumulative security update, but it was much more: the update included major changes to the Internet Explorer JavaScript engine. It’s because of delightful black swans like this that running a SaaS business is not for the faint of heart.

I once wrote an essay on my cybersecurity for medical device blog called The Microsoft Monoculture as a threat to national security.


I suggested that the FDA might consider banning Windows as an operating system platform for medical devices and their accompanying information management systems.

One of my readers took umbrage at the notion of legislating one monoculture (Microsoft) with another (Linux) and how the Linux geeks are hooked on the CLI just like Windows users are hooked on a GUI.

The combination of large numbers of software vulnerabilities, user lock-in created by integrating applications with Windows, the complexity of Microsoft products and their code, and Microsoft’s predatory trade practices is diametrically different from Linux and the FOSS movement.

The biggest threat to medical devices in hospitals is old Windows versions

One of the biggest threats to medical devices in hospitals is the widespread use of USB flash drives and Windows notebooks to update medical device software. With Windows’ infamous auto-run feature for removable media, a flash drive is an easy attack vector for propagating malware via Windows-based medical devices into a hospital network. This is one (and not the only) reason why I am campaigning against the use of Windows in medical devices.

This  has nothing to do with the CLI or GUI of the operating system and personal preferences for a user interface.

This has everything to do with manufacturing secure embedded medical devices that must survive in the most demanding, heterogeneous and mission-critical environment one can imagine – a modern hospital.

I never advocated mandating Linux by law for medical devices.

It might be possible to mandate a complex set of software security requirements instead of outlawing Windows in embedded medical devices, as a more politically correct but far more costly alternative for the FDA and the US taxpayer.

Regardless of the politics involved (and they are huge…) – if the FDA were to remove Windows from an approved list of embedded medical device operating systems – the costs to the FDA would decrease since the FDA would need less Windows expertise for audits and the threat surface they would have to cover for critical events would be smaller.

Killed by code in your connected medical device


Are we more concerned with politicians with pacemakers or families with large numbers of connected medical devices?

Back in 2011, I thought it would only be a question of time before we saw a drive-by execution of a politician with an ICD (implanted cardiac device). In May 2019, with mushrooming growth in connected medical devices (and after the 2019 Israeli elections), I am rethinking my risk analysis.

Consider this: if a typical family of 2 parents and 3 children has 5 mobile devices, it is reasonable to expect that this number will double with medical IoT and software-as-a-device products for diabetes management, asthma monitoring, fetal monitoring, remote diagnosis of children, home-based urine testing and more.

So far, it seems the politicians are still around, but the cybersecurity vulnerabilities of medical devices are growing in frequency and impacting big medical device vendors like Medtronic, as reported by the FDA in March 2019 – Cybersecurity Vulnerabilities Affecting Medtronic Implantable Cardiac Devices, Programmers, and Home Monitors.

Audience:

-Patients with a Medtronic implantable cardioverter defibrillator (ICD) or cardiac resynchronization therapy defibrillator (CRT-D)

-Caregivers of patients with a Medtronic ICD or CRT-D

-Cardiologists, electrophysiologists, cardiac surgeons, and primary care physicians treating or managing patients with heart failure or heart rhythm problems using a Medtronic ICD or CRT-D

-Medical specialties: Cardiac Electrophysiology, Cardiology, Cardiothoracic Surgery, Heart Failure

Purpose: The U.S. Food and Drug Administration (FDA) is issuing this safety communication to alert health care providers and patients about cybersecurity vulnerabilities identified in a wireless telemetry technology used for communication between Medtronic’s implantable cardiac devices, clinic programmers, and home monitors. The FDA recommends that health care providers and patients continue to use these devices as intended and follow device labeling.

Although the system’s overall design features help safeguard patients, Medtronic is developing updates to further mitigate these cybersecurity vulnerabilities. To date, the FDA is not aware of any reports of patient harm related to these cybersecurity vulnerabilities.

On January 9, 2017, the FDA issued a Safety Communication on “Cybersecurity Vulnerabilities Identified in St. Jude Medical’s Implantable Cardiac Devices and Merlin@home Transmitter”.

At risk:

-Patients with a radio frequency (RF)-enabled St. Jude Medical implantable cardiac device and corresponding Merlin@home Transmitter

-Caregivers of patients with an RF-enabled St. Jude Medical implantable cardiac device and corresponding Merlin@home Transmitter

-Cardiologists, electrophysiologists, cardiothoracic surgeons, and primary care physicians treating patients with heart failure or heart rhythm problems using an RF-enabled St. Jude Medical implantable cardiac device and corresponding Merlin@home Transmitter

Different classes of device, different threat scenarios. A wellness app does not have the same threat model as an implanted device.

I’ve been talking to our medical device customers about mobile security of implanted devices for over 7 years now.

I  gave a talk on mobile medical device security at the Logtel Mobile security conference in Herzliya in 2012 and discussed proof of concept attacks on implanted cardiac devices with mobile connectivity.

But – ICDs are the edge, the corner case of mobile medical devices.

If a typical family of 2 parents and 3 children has 5 mobile devices, it is a reasonable scenario that this number will double with devices for fetal monitoring, remote diagnosis of children, home-based urine testing and more.

Mobile medical devices are becoming a pervasive part of the Internet of things; a space of  devices that already outnumber workstations on the Internet by about five to one, representing a $900 billion market that’s growing twice as fast as the PC market.

There are 3 dimensions to medical device security – regulatory (FDA), political (Congress) and cyber (vendors implementing the right cyber security countermeasures)

The FDA is taking a tailored, risk-based approach that focuses on the small subset of mobile apps that meet the regulatory definition of “device” and that either:

-are intended to be used as an accessory to a regulated medical device, or

-transform a mobile platform into a regulated medical device.

Mobile apps span a wide range of health functions. While many mobile apps carry minimal risk, those that can pose a greater risk to patients will require FDA review. The FDA guidance document provides examples of how the FDA might regulate certain moderate-risk (Class II) and high-risk (Class III) mobile medical apps. The guidance also provides examples of mobile apps that are not medical devices, mobile apps over which the FDA intends to exercise enforcement discretion, and mobile medical apps that the FDA will regulate, in Appendix A, Appendix B and Appendix C.

Mobile and medical and regulatory is a pretty sexy area and I’m not surprised that politicians are picking up on the issues. After all, there was an episode of CSI New York  that used the concept of an EMP to kill a person with an ICD, although I imagine that a radio exploit of  an ICD or embedded insulin pump might be hard to identify unless the device itself was logging external commands.

See my presentation ‘Killed by code’

Congress is, I believe, more concerned about the regulatory issues than the patient safety and security issues:

Representatives Anna Eshoo (D-CA) and Ed Markey (D-MA), both members of the House Energy and Commerce Committee, sent a letter last August asking the GAO to study the safety and reliability of wireless healthcare technology and report on the extent to which the FCC is:

Identifying the challenges and risks posed by the proliferation of medical implants and other devices that make use of broadband and wireless technology.
Taking steps to improve the efficiency of the regulatory processes applicable to broadband and wireless enabled medical devices.
Ensuring wireless enabled medical devices will not cause harmful interference to other equipment.
Overseeing such devices to ensure they are safe, reliable, and secure.
Coordinating its activities with the Food and Drug Administration.

At Black Hat in August 2011, researcher Jay Radcliffe, who is also a diabetic, reported how he used his own equipment to show how attackers could compromise instructions sent to wireless insulin pumps.

Radcliffe found that his monitor had no verification of the remote signal. Worse, the pump broadcasts its unique ID, so he was able to send the device a command that put it into SUSPEND mode (a DoS attack). That meant Radcliffe could overwrite the device configuration to inject more insulin – and once insulin is injected, it cannot be removed from the body (the only recourse is to eat something sugary).
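
At its core, Radcliffe’s finding is a missing authentication check: the pump accepted any well-formed command. The sketch below shows the general shape of a countermeasure – a shared-key HMAC over the command plus a monotonically increasing counter, so forged and replayed commands are rejected. It is illustrative only, not any vendor’s actual pump protocol.

```python
# Illustrative sketch only -- not any vendor's actual pump protocol.
# A shared-key HMAC over (command, counter) lets the pump reject forged
# commands; the counter rejects replays of previously valid commands.
import hashlib
import hmac
import struct

SHARED_KEY = b"provisioned-at-pairing-time"  # hypothetical pairing secret

def sign_command(command: bytes, counter: int) -> bytes:
    """Remote side: append a counter and MAC to the command."""
    payload = command + struct.pack(">Q", counter)
    mac = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload + mac

def verify_command(message: bytes, last_counter: int) -> bytes | None:
    """Pump side: return the command only if the MAC and counter check out."""
    payload, mac = message[:-32], message[-32:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(mac, expected):
        return None  # forged or corrupted command
    command, counter = payload[:-8], struct.unpack(">Q", payload[-8:])[0]
    if counter <= last_counter:
        return None  # replayed command
    return command

# An unsigned "SUSPEND" broadcast like the one in the PoC is rejected;
# a properly signed one with a fresh counter is accepted.
assert verify_command(b"SUSPEND" + b"\x00" * 40, last_counter=0) is None
assert verify_command(sign_command(b"SUSPEND", 1), last_counter=0) == b"SUSPEND"
```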

The FDA’s position that it is sufficient to warn medical device makers that they are responsible for updating equipment after it is sold, and the downplaying of the threat by industry groups like the Advanced Medical Technology Association, are not constructive.

Following the proof-of-concept attack on ICDs by Daniel Halperin from the University of Washington, Kevin Fu from U. Mass Amherst et al, “Pacemakers and Implantable Cardiac Defibrillators: Software Radio Attacks and Zero-Power Defenses”, this is a strident wake-up call to medical device vendors to implement more robust protocols and tighten up the software security of their devices.

The golden rule for digital therapeutics and connected medical devices

He who has the gold rules.   That’s all you need to know when it comes to privacy compliance.

In the past 5 years, a lot has happened in the digital health space. Venture funding in 2018 was close to $10BN and a lot of work is being done in the area of digital therapeutics and connected medical devices.

As our customers progress through their clinical trial journey to FDA clearance and post-marketing, we are frequently asked how to achieve HIPAA compliance in an era of digital health apps, medical IoT and collection of RWD (real-world data) from patients.

I will try and help connected medical device engineering and regulatory managers make sense out of HIPAA and the HITECH Act (Health Information Technology for Economic and Clinical Health).

On January 25, 2013, the HIPAA Omnibus Rule was published in the Federal Register, which created the final modifications to the HIPAA privacy and security rule. You can see the source of the law here.

The HITECH Act created a supply chain trust model.

According to 45 CFR 164.502(e), the Privacy Rule applies only to covered entities (healthcare providers, health plans and healthcare clearinghouses). Going down the chain, covered entities have suppliers who are defined as BAs (business associates). A business associate is a supplier that creates, receives, maintains, or transmits protected health information on behalf of a covered entity or other business associates.

The HITECH Act requires suppliers in the chain of trust to comply with the Security Rule.  A medtech company and its cloud service providers, customer engagement service providers et al are all business associates.

The HITECH Act does not impose all Privacy Rule obligations upon a BA but:

1. BAs are subject to HIPAA penalties if they violate the required terms of their BA Agreement (BAA).

2. BAs may use or disclose PHI only in accordance with the required terms of their BAA.

3. BAs may not use or disclose PHI in a manner that would violate the Privacy Rule if done by the covered entity (CE).

Down the supply chain and to the right

When we go downstream in the supply chain, the BAA becomes more and more restricted regarding permissible uses and disclosures.

For example, if a business associate agreement between a covered entity and a supplier does not permit the supplier to de-identify protected health information, then the business associate agreement between the supplier and a subcontractor (and the agreement between the subcontractor and another subcontractor) cannot permit the de-identification of protected health information. Such a use may be permissible if done by the covered entity, but is not permitted by the downstream suppliers in the supply chain, if it is not permitted by the covered entity’s business associate agreement with the contractor.
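
One way to think about this downstream restriction is as set intersection: each business associate agreement can grant at most what the agreement above it grants. A minimal sketch, with illustrative permission names:

```python
# Minimal sketch (illustrative permission names): permitted uses of PHI can
# only narrow as you move down the supply chain -- each BAA is intersected
# with everything granted above it.
COVERED_ENTITY_USES = {"treatment", "payment", "operations", "de-identify"}

def downstream_permissions(chain: list[set[str]]) -> set[str]:
    """Effective permissions at the bottom of a chain of BAAs."""
    allowed = set(COVERED_ENTITY_USES)
    for baa_grants in chain:
        allowed &= baa_grants
    return allowed

# Covered entity -> digital therapeutics company -> cloud subcontractor.
# The first BAA does not permit de-identification, so no subcontractor
# further down can be granted it, even if its own agreement lists it.
chain = [
    {"treatment", "operations"},                 # CE -> DTx company
    {"treatment", "operations", "de-identify"},  # DTx -> cloud provider
]
print(downstream_permissions(chain))  # only treatment and operations survive
```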

Concrete example of a digital therapeutic.

A physician (covered entity) prescribes a digital therapeutic app. The physician writes a script that is sent to a customer service center, which provides customer support to patients to download and use the app.

The healthcare provider will need a BAA with the digital therapeutics company (or its customer service center, which may be a separate business), which then has BAAs with other online suppliers for cloud and Braze customer engagement services. Graphically, the supply chain looks like this:

As we move down the supply chain and to the right, we see that the suppliers are providing specific and more restricted digital services.

(Figure: digital therapeutics HIPAA supply chain)

The golden rule

Although a BAA is a formal regulatory requirement, it includes compliance with the HIPAA Security Rule and possible exposure to Privacy Rule disclosures. To a large degree, the Golden Rule applies – “He who has the gold rules”.  For early-stage medtech and digital therapeutics companies, your customers have the gold. Do your homework on your security and privacy risk assessment. Consider external threats as well as possible exploits and cascade attacks on your APIs.

Invisible gorillas and detection of adverse events in medical device trials

Weekly Episode #1 - Patients and study monitors are both people.

What is easier to detect in your study – slow-moving or fast-moving deviations?

This post considers human frailty and strengths.

We recently performed a retrospective study of the efficacy of Flaskdata.io automated study monitoring in orthopedic trials. An important consideration was the ability to monitor patients who had received an implant and were on a long-term follow-up program. Conceptually, monitoring small numbers of slow-moving, high-risk events is almost impossible to do manually since we miss a lot of what goes on around us, and we have no idea that we are missing so much. See the invisible gorilla experiment for an example.
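
To make “slow-moving” concrete, here is a minimal sketch of an automated rule that flags a follow-up patient who has gone quiet or whose ePRO pain scores are drifting upward. The data layout, thresholds and field names are my assumptions for illustration, not the Flaskdata.io rules.

```python
# Minimal sketch (hypothetical data layout and thresholds): flag the kind of
# slow-moving, high-risk signal a human monitor tends to miss -- a long-term
# follow-up patient who has gone quiet or whose pain scores keep rising.
from dataclasses import dataclass
from datetime import date

@dataclass
class FollowUp:
    patient_id: str
    report_date: date
    pain_score: int  # 0-10 ePRO scale

def flag_slow_moving_risk(reports: list[FollowUp], today: date,
                          max_silence_days: int = 30,
                          pain_threshold: int = 7) -> list[str]:
    """Return reasons to escalate this patient to a human monitor."""
    flags = []
    reports = sorted(reports, key=lambda r: r.report_date)
    if (today - reports[-1].report_date).days > max_silence_days:
        flags.append(f"no patient report in over {max_silence_days} days")
    last_three = [r.pain_score for r in reports[-3:]]
    if (len(last_three) == 3 and last_three == sorted(last_three)
            and last_three[-1] >= pain_threshold):
        flags.append("pain score rising and above threshold")
    return flags

reports = [
    FollowUp("P-017", date(2019, 3, 1), 4),
    FollowUp("P-017", date(2019, 4, 2), 6),
    FollowUp("P-017", date(2019, 5, 3), 8),
]
print(flag_slow_moving_risk(reports, today=date(2019, 6, 20)))
# both rules fire -> escalate to the monitor before the next scheduled visit
```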

One of the patients in the study, who had received a spinal implant and was on a 6-month follow-up program, dived into a pool to swim a few laps and drowned despite being a strong swimmer. Apparently, the pain caused by movement of the implant resulted in loss of control and a severe adverse event. The patient had disregarded instructions regarding strenuous physical activity and the results were disastrous.

It seems to me that better communications with the patients in the medical device study could have improved their level of awareness of safety and risk and perhaps avoided an unnecessary and tragic event.

Subjects and study monitors are both  people.

This might be a trivial observation but I am going to say it anyhow, because there are lessons to be learned by framing patients and monitors as people instead of investigation subjects and process managers. 

People are the specialists in their personal experience; the clinical operations team are the specialists in the clinical trial protocol. Let’s not forget that subjects and study monitors are both people.

Relating to patients in a blinded study as subjects without feelings or experience is problematic. We can relate to patients in a personal way without breaking the double blinding and improve their therapeutic experience and their safety. 

We should relate to study monitors in a personal way as well, by providing them with great tools for remote monitoring and enabling them to prioritize their time on important areas such as dosing violations and sites that need more training. We can use analytics of online data from the EDC, ePRO, eSource and connected medical devices to enhance and better utilize clinical operations teams’ expertise in process and procedure.
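
As an illustration of that kind of analytics, here is a minimal sketch that ranks sites by dosing violations per enrolled subject so the monitor knows where to spend the next visit. The data shapes are hypothetical, not the Flaskdata.io implementation.

```python
# Minimal sketch (hypothetical data): rank sites by dosing violations per
# enrolled subject so remote-monitoring time goes where it matters most.
from collections import Counter

def rank_sites(dosing_violations: list[str],
               enrolled: dict[str, int]) -> list[tuple[str, float]]:
    """Sites ordered by dosing violations per enrolled subject, worst first."""
    counts = Counter(dosing_violations)
    rates = {site: counts.get(site, 0) / n for site, n in enrolled.items() if n}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)

# Each list entry is one dosing violation reported at that site.
violations = ["IL-01", "IL-01", "IL-01", "DE-02", "US-07"]
enrolled = {"IL-01": 12, "DE-02": 30, "US-07": 9}
print(rank_sites(violations, enrolled))
# IL-01 tops the list at 0.25 violations/subject -- it gets the next visit
# and a training refresher.
```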

A ‘patient-centered’ approach to medical device clinical trials

In conditions such as Parkinson’s disease, support-group meetings and online sharing are used to stay on top of medication, side effects, falls and the patient’s general feeling, even though decisions on the treatment plan need to be made by an expert neurologist / principal investigator, and oversight of protocol violations and adverse events is performed by the clinical operations team. There are many medical conditions where patients can benefit from taking a more involved role in the study. One common example is carpal tunnel syndrome.

According to the findings of a study in the August 3, 2011 issue of the Journal of Bone and Joint Surgery (JBJS), patients receiving treatment for carpal tunnel syndrome (CTS) prefer to play a more collaborative role when it comes to making decisions about their medical or surgical care.

Treatment of carpal tunnel syndrome, which is very common and also extremely dependent upon patient behavior and compliance, is a great example of the effectiveness of the shared decision-making (or collaborative) model in medicine, in which the physician and patient exchange medical and other information related to the patient’s health and make decisions together.

As the article in JBJS concludes:

“This study shows the majority of patients wanted to share decision-making with their physicians, and patients should feel comfortable asking questions and expressing their preferences regarding care. Patient-centered care emphasizes the incorporation of individual styles of decision making to provide a more patient-centered consultation,” Dr. Gong added. 

In a ‘patient-centered’ approach to medical device clinical trials, patients’ cultural traditions, personal preferences and values, family situations, social circumstances and lifestyles are considered in the decision-making process.

Automated patient compliance monitoring with tools such as Flaskdata.io is a great way to create a feedback loop of medical device clinical data collection, risk signature improvement, detection of critical signals and communication of information to patients. At the same time, automated real-time patient compliance monitoring is a great way of enhancing clinical operations team expertise.

Patients and study monitors are both people. 

Strong patient adherence in real life starts with strong people management


Patient adherence in real life starts in the clinical trials that determine the safety, side effects and efficacy of the intervention, whether a drug or a medical device.

Like any other industry – success in clinical trials is all about the people.

The hugely successful movie “Hidden Figures” tells the story of the gifted black women mathematicians who played key roles in NASA’s Mercury and Apollo space programs. It is a moving, inspiring and sometimes hilarious story of how NASA, a predominantly white male organization, came to accept diversity during American desegregation.

By comparison, the Israeli life science industry lives in a different time and place and women are in leadership roles at all levels  of Israeli life science companies.

In this 4-part series of articles, we will tell the story of the gifted Israeli women who are the “Hidden Figures” of the Israeli biomed/biotech industry.

Women comprise about 65 percent of Israel’s biotechnology workforce, and about 13 percent of top management positions in companies listed on the Tel Aviv Biomed index. In order to find out what attracts Israeli women into this globally male dominated field, I talked to a number of well-respected women, tried to learn about their story, get acquainted with their mindsets and solve the “mystery” of Israeli women invading this field.

Part 1 of the series tells the story of Hagit Nof – former Country Manager of IQVia in Israel and currently the COO & BD of nRollmed, an Israeli startup that helps clinical trial sponsors speed up their studies using online patient recruitment and optimization.

(IQVia is the world’s largest provider of biopharmaceutical development and commercial outsourcing services ).

Hagit has a great story of a dream come true for a person who was not afraid to make a risky decision at the right time and was able to build a career in the biopharmaceutical industry literally from scratch.


What real-time data and Risk-based monitoring mean for your CRO

A widely neglected factor in cost-effective risk-based clinical trial monitoring is availability and accessibility of data.

RBM methods used by a central clinical trial  monitoring operation that receives stale data (any data from patients that is more than a day old is stale) are ineffective. Every day that goes by without having updated data from patients, devices and investigators reduces the relevance and efficacy of remote monitoring.

Real-time data is a sine qua non for RBM.

Sponsors and Contract research organizations (CROs) should therefore approach real-time data and risk-based monitoring (RBM) as 2 closely related priorities for executing clinical trials. Use of modern data technologies for real-time data collection and remote risk-based monitoring will reduce non-value added rework, people and paper in clinical trials and help speed up time to statistical report.
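
To make the one-day rule measurable, here is a minimal sketch that reports what fraction of patient data streams are stale. The field names and data source are assumptions; in practice the timestamps would come from the EDC, ePRO or device feed.

```python
# Minimal sketch of the "stale data" rule above: any patient data stream whose
# latest record is more than a day old is stale for RBM purposes.
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(days=1)

def stale_fraction(last_updates: list[datetime],
                   now: datetime | None = None) -> float:
    """Fraction of patient data streams whose latest record is stale."""
    now = now or datetime.now(timezone.utc)
    if not last_updates:
        return 0.0
    stale = sum(1 for ts in last_updates if now - ts > STALE_AFTER)
    return stale / len(last_updates)

# One "latest record" timestamp per enrolled patient (illustrative).
now = datetime(2019, 6, 20, 12, 0, tzinfo=timezone.utc)
updates = [now - timedelta(hours=h) for h in (2, 6, 30, 50)]
print(stale_fraction(updates, now))  # 0.5 -- half the patients are stale
```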


The 3 tenets for designing a clinical data management system

Abstract:
This post reviews the importance of 1) proper study design, 2) good data modeling and 3) realistic estimation of project timetables. The article concludes with a discussion of eSource and attempts to dispel some myths, including the claim that DIY EDC study builds save time (they don’t).

Enjoy!

The trend of DIY: good for EDC vendors, less good for sponsors

The trend for small studies/IIS (investigator-initiated studies) is to use cloud EDC applications that enable end-users to build eCRFs and edit checks using a graphical user interface. This so-called DIY (do-it-yourself) approach is used by most cloud EDC vendors, such as Medrio and Clincapture, as a way of lowering their barriers to entry to the market.

However – what is good for vendors (lowered barriers to entry) is not necessarily good for sponsors (faster time to market of their innovative drug or medical device).


How to ensure patient compliance in patient-centric clinical trials

Patient-centered clinical trials are a growing trend. As both drug and medtech companies increasingly explore the use of medical IoT for clinical trials, are we discovering new opportunities or forgetting old lessons learned? Dr. Jane Bluestein talks about how to ensure patient compliance by understanding the subtle differences between boundaries and rules.
