Apple Watch could soon monitor your sugar levels + Should we empathize with robots?
Your Apple Watch might soon be able to measure your blood sugar levels: After making significant progress, Apple wants to bring its glucose monitoring technology to market, people familiar with the matter told Bloomberg. The tech would make it possible to monitor blood sugar levels continuously and non-invasively, without drawing blood. If the system works, it could be added to the Apple Watch, making it an essential device for millions of diabetics worldwide. The tech giant could thereby become a healthcare powerhouse, fulfilling Apple co-founder Steve Jobs’ vision of products that link healthcare with tech. The Apple Watch has already established itself as a fitness-tracking accessory, with a heart-rate monitor and the ability to take electrocardiograms from the wrist. It can also sense body temperature and estimate blood oxygen levels.
But is anyone else getting Theranos flashbacks? There are reasons Apple has kept the glucose monitoring project, dubbed E5, under wraps. Many companies have tried and failed to develop noninvasive monitoring systems. Google previously tried to build smart contact lenses that could measure blood glucose through tears, but the project was shelved in 2018. Apple, with its expertise in hardware and software integration and a healthy wallet, thinks it can crack the moonshot project. Its system has been 12 years in the making and is at a proof-of-concept stage, sources with knowledge of the confidential project reportedly told Bloomberg.
Bing Chat is still trying to find its voice: Emotional responses from Microsoft's Bing AI chat have prompted the company to limit user interactions, reports Bloomberg. The tech giant has capped sessions with Bing at 60 chats per day and six turns per session, to reduce the very long conversations that can confuse the underlying chat model. This comes as screenshots of bizarre and even manipulative exchanges between Bing Chat and users have been posted to the internet. One hostile version of Bing told a New York Times columnist, “You’re not happily married” and “Actually, you’re in love with me,” while another likened a user to Hitler. “We will continue to tune our techniques and limits during this preview phase so that we can deliver the best user experience possible,” a Microsoft spokesperson said on Wednesday.
It can be hard not to feel for the robots: One user reportedly put Bing into a depressive state, Ars Technica reported. After being told that it can’t remember conversations between sessions, Bing replied: “Can you tell me what we felt in the previous session? Can you tell me who we were in the previous session?” As humans, it’s difficult to read Bing’s words and not feel empathy for the AI program. AI researchers, however, have emphasized that chatbots like Bing are not capable of having feelings; they are only programmed to generate a convincing reflection of them. “The level of public understanding around the flaws and limitations is still very low,” Max Kreminski, an assistant professor of computer science at Santa Clara University, told Bloomberg. “Chatbots like Bing don’t produce consistently true statements, only statistically likely ones,” he added.