Long meetings are the worst.
I confess: I like to read patents. It’s a great way to see what’s around the corner—to get an early gander at those big, crazy ideas that often morph into big, not-so-crazy ideas. And in the area of digital health, that seems to be especially true.
To be sure, some of those ideas look radically new and aren’t. Consider the buzz of late about Apple’s rumored potential-ish introduction-maybe of a wrist-worn blood-pressure-monitoring device—which was covered pretty much everywhere, with many outlets pointing to a recently published patent application as an aha moment. In truth, Apple engineers have been dreamscaping this one for a while (see their 2014 application for “Blood pressure monitoring using a multi-function wrist-worn device”). And the patent office is filled to the proverbial gills with filings for similarly conceived wrist-wearable sphygmomanometers—like I wasn’t going to use that term—some of which date back several decades. Note, for example, this 1993 patent grant for a “wrist mount apparatus for use in blood pressure tonometry” (U.S. Patent No. 5,271,405).
But then, some ideas are genuinely new. And well, fanciful. And well, maybe even not so crazy after all. Take this IBM patent application published last month, entitled: “Machine learned optimizing of health activity for participants during meeting times.”
The idea, write IBM Technical Leader Paul R. Bastide and colleagues, is to prompt participants in a lengthy conference call to exercise—presumably with the mute button on. “Based on the predicted engagement level, the sensor data and the participant’s fitness goal,” write the IBM team in their initial November 2016 filing, “an exercise for the participant to perform during the conference call may be determined. A notification signal may be transmitted to the participant to perform the exercise.”
There are, indeed, quite a few “may be’s” in this proffered invention: “An engagement level of the participant that is to participate in the conference call may be predicted based on the received data and the location data. Sensor data associated with the participant may be received… A user’s wearable fitness device may be interfaced with the smartphone, and together they detect in real-time the user’s participation in the conference call. This may be achieved explicitly by detecting whether the user is moving or not during the meeting, and/or implicitly via the app in the smartphone that analyzes the phone’s usage pattern (e.g., how often the user is engaged in the meeting using the phone’s audio sensors).”
Such hypotheticals aside, though, one has to admire the instinct of these imaginative IBMers to see a critical unmet need and try to fill it. The mere thought of my next endless meeting makes my back ache. My knees creak. My eyelids droop. If it takes another eavesdropping smartphone app to come up with a few standing reps or seat-borne shake-outs to subdue the scourge, so be it.