One Thing You Should Never Forget When Evaluating Emerging Technology

In 2017, Google had to rush out a patch for its Home Mini. A flaw in the touch panel on top of the unit, which is normally activated by a long-press, put the device into near-continuous recording mode, sending a steady stream of user data to Google’s servers.

In 2011, HP killed off the TouchPad after only seven weeks on the market. The highly touted iPad rival was quickly deemed a failure, marred by webOS software issues that resulted in a lack of consumer interest.

In 2009, the Twitter Peek launched and quickly died as an epic dud. It was an overpriced mobile device designed solely for using Twitter, yet it failed even at that singular task: it showed only a few words of each tweet before forcing the user to navigate clunky software.

Way back in the mid-1990s, Microsoft killed off Microsoft Bob after it failed to gain traction or meet expectations. As Bill Gates later wrote, “Unfortunately, the software demanded more performance than typical computer hardware could deliver at the time and there wasn’t an adequately large market. Bob died.”

The tech industry is littered with failures like these. Many can be attributed to poor market research, design flaws, insufficient software testing, or misjudged consumer sentiment.

Often, though, there’s an underlying perspective that gets neglected, dismissed, or simply overlooked when solutions are evaluated for use.

A critical element for successful solution evaluation

What’s often missing in the early evaluation of solutions like these is a mission-centric approach: the notion that a product or service must be qualified not only to perform a given task but, more importantly, to perform it in your particular environment. Notice I didn’t say all environments. Every product or service will have some sort of limitation. It’s up to you to determine if, when, and how that might impact your organization.

A technology may perform exceptionally well in a lab yet fail miserably when deployed to production. Understanding the end-state operational environment is key to determining whether any solution will ultimately succeed.

How to use mission knowledge to truly vet emerging technology

A few years ago, a client asked us to assess the state of the industry for face liveness detection technology. Put another way, we needed to determine the maturity of mobile phone solutions that could detect spoofing attempts during face recognition. Was the user wearing a mask? Holding up a picture of a friend? Playing back a recording?

We believe challenges such as these are more than a simple bake-off of technical capabilities. Our approach is comprehensive and contextual, rooted in a deep appreciation for how the technology needs to work in the wild.

In this case, our mission knowledge of the client’s environment was critical in formulating our test and evaluation approach and informed our process throughout. This included analyzing the solution in terms of approved hardware platforms, availability of SDKs for expected integration, software security requirements, unique target user personas, operational use cases, and environmental conditions.

This understanding of the mission space eliminated the need for weeks of requirements gathering. We were quickly able to construct a testing methodology that put the technologies through their paces to discover what worked well and what didn’t.

Results of a mission-centric approach

For this particular 30-day test, we identified 11 environmental use cases, 5 facial occlusion tests, and 7 presentation attack vectors, amounting to more than 6,000 test case executions.
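
To put those numbers in context, here’s a minimal sketch, in Python, of how a test matrix like this multiplies out. The dimension names and the repetition count are hypothetical placeholders, not our actual test plan; the point is simply that 11 × 5 × 7 yields 385 unique combinations, and even a modest number of repetitions per combination (across devices, subjects, or attempts) pushes the total past 6,000 executions.

```python
from itertools import product

# Hypothetical test dimensions; the real categories were client-specific.
environments = [f"env_{i}" for i in range(1, 12)]       # 11 environmental use cases
occlusions   = [f"occlusion_{i}" for i in range(1, 6)]  # 5 facial occlusion tests
attacks      = [f"attack_{i}" for i in range(1, 8)]     # 7 presentation attack vectors

REPETITIONS = 16  # assumed repetitions per combination (e.g., devices x attempts)

# Every combination of environment, occlusion, and attack vector is a test case.
test_cases = list(product(environments, occlusions, attacks))

print(len(test_cases))                # 385 unique combinations
print(len(test_cases) * REPETITIONS)  # 6,160 total executions, past the 6,000 mark
```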

We discovered something quite interesting during our testing. We knew the technology would be used across an array of operational environments, even though this requirement was not explicitly stated to us at the outset. To address it, we tested under a variety of conditions: daytime and nighttime, indoors and outdoors, under varying lighting, and inside vehicles. What we found was that one of the best-performing systems, one that could not be spoofed in the lab, failed miserably once tested outdoors. So much so that it was essentially inoperable.

This seems simple, yet until that point it was unknown, perhaps even to the vendor. It was a significant finding given our expectation that users would indeed use the system outdoors. While a result like this may not completely disqualify a technology, it serves as a valuable input to a more informed assessment of risk going forward.

What this means for you

This is a prime example of a technology that performs well under ideal conditions but struggles once placed in an unfamiliar environment. Whether you provide emerging technology services or are a client who relies on them, it’s imperative that any analysis of a technology is done with full consideration of the operational mission.

Without a mission-centric analysis, a technology may be prematurely deemed acceptable, procured outright, and deployed widely, only to prove non-functional once in the hands of the end user. A narrow analysis damages your reputation, delays solution delivery, and incurs unnecessary financial costs.

Too often, vendors weave fantastical stories of how their products and services will uniquely solve all of your problems. It’s up to you to instill a mission-centric methodology that cuts through the fluff and ensures the solution meets your particular needs. You don’t want to have to kill Bob.
