
Sony CRE-E10 Review: Well-Rounded Hearing Aids


When Sony entered the over-the-counter hearing aid market two years ago, it did so with a pair of products: the CRE-C10 and the more expensive CRE-E10. I was dazzled by the minuscule C10—it’s still one of the hearing aid options I recommend the most—and assumed the E10 would be even more impressive. Now that I’ve finally landed a pair of E10 aids to test, I can assure you that the E10 isn’t so much an upgrade to the C10 as it is a wholly different class of product with its own pros and cons.

While both the C10 and E10 rely on an in-ear, earbud-like design conceit, their general approaches differ considerably. The 1-gram C10 fits nearly entirely inside the ear, invisible enough to require a small retrieval wire to remove it, while the 2.94-gram E10 is much more bulbous and visible. It looks more like a standard Bluetooth earbud than any other hearing aid I’ve tested, filling the concha with its rounded body. Since, as the old saying goes, all conchae are not created equal, your comfort level while wearing these hearing aids may vary considerably. In my ears, the fit was snug but not tight—comfortable to wear for a few hours but not all day. Sony provides just four pairs of eartips you can experiment with to help improve the fit.

Two black in-ear hearing aids with dark grey canal cushions

Photograph: Sony

The other big difference between the C10 and E10 is power: while the C10 uses replaceable hearing aid batteries, the E10 features a more common rechargeable battery. The device’s extra size lets the E10 run for up to 26 hours (without streaming), and the USB-C- and Qi-compatible charger provides enough juice for an additional two to three recharges.

Despite their larger size, the CRE-E10 aids do not feature any external controls, which is understandable, since the way the aids sit in the ear would make physical controls hard to reach. Instead, all controls are situated in Sony’s Hearing Control app (Android, iOS). This is the same app used for the CRE-C10, so I already had it installed, but I ran into immediate problems because the old aids were still registered to the app.

Side view of two black in-ear hearing aids

Photograph: Sony

To set up new aids, you have to remove the old ones from the app. To do that, Hearing Control requires you to enter a code sent to your registered email address. Naturally, I never received the code, so I couldn’t register the new set of aids. Eventually, Sony tech support instructed me to delete the app altogether and set it up again with a different email address—not the most elegant solution, perhaps, but it got me up and running.



Orka Two Review: Sleek Hearing Aids


Founded in 2018, Orka Labs feels like a bigger and more established hearing aid company than it is, with polished hardware that’s now on its second edition.

The Orka Two is something of a hybrid between prescription and over-the-counter hearing aids. The devices are registered as prescription-class aids but are sold online as OTC products. Professional medical consultations and adjustments are available (and included in the price) but are not required if you decide to go it alone.

The hardware is traditional in form, a behind-the-ear model with receivers that snake into the ear canal via flexible wires. But while they are a bit oversized in comparison to similar designs (and rather heavy at 3.8 grams each), they are distinguished by their glossy AirPod-white color and curvy, teardrop design. The units carry no physical controls, which further improves their sleekness. For behind-the-ear hearing aids, these look about as good as you could expect—and much better than the usual industrial-gray aids that are now so commonplace.

As with most over-ear aids, I found the units a little clumsy to fit and in need of significant fidgeting to situate them properly in my ears. The usual collection of open and closed tips is included in the box. While I normally find that medium-sized tips fit perfectly for me, I found all but the smallest uncomfortably large.

Two side-by-side white over-the-ear hearing aids with grey ear canal cushions

Photograph: Orka

In keeping with its hybrid design, Orka offers two ways to configure the units. There’s a capable hearing test built into the app, which can be used to quickly make the appropriate settings. Alternatively, if you have a professional audiogram, you can snap a photo and upload it through the app. Then Orka’s in-house audiologists will tune your aids accordingly (in one business day). Further adjustments can be made by emailing or calling Orka, though the company notes that its “remote consultation” feature, which lets you schedule an appointment directly through the app, is currently being revamped and is offline.

Orka’s app is straightforward to the point of being idiot-proof, with two primary operating modes. “Normal” is the low-environmental-noise mode that relies on the settings made via your audiogram or in situ hearing test, while “In Noise” is, well, self-explanatory. Here, Orka gets more aggressive, using an AI algorithm to adjust settings dynamically in response to your environment. A beam-shaping option in the In Noise mode lets you target your hearing on a single person or on “everyone.” Volume can be adjusted universally or individually for each ear.

As noted earlier, there are no physical controls on the units. Unusually, hardware controls are found on the charging case (which is good for about three charges). Here you’ll find a program button that cycles through the two operational modes and another pair of buttons for adjusting volume. Pay close attention: Volume up is, counterintuitively, the button on the left, and volume down is on the right. Despite the reversal, I ultimately found the case-mounted buttons a lot more convenient than fumbling behind my ears to find the right buttons. For users with mobility impairments, this could be a game changer.



Ceretone Core One OTC Hearing Aids Review: Tiny and Barely Useful


Indiegogo-backed Ceretone is yet another hearing aid company aimed at people looking for a low-cost, low-complexity way to give their hearing a boost. At $349 for a pair—or $229 for a single ear’s aid—the tiny hearing aids are designed to have only a modest impact on hearing. Fortunately, they also make an equally modest impact on the wallet.

The first thing you’ll notice about the Core One is how small the hearing aids are. I weighed them at 0.96 grams each (with a small ear tip), which makes them perhaps the smallest aids I’ve tested to date—just a hair lighter than the Sony CRE-C10. The glossy white aids slip entirely into the ear canal, with only the recovery thread sticking out a few millimeters for retrieval. Unless you closely examine your ears, they are functionally invisible.

Out of the ear, they’re not so unobtrusive. Color-coded, cone-shaped ear tips (one blue, one red) provide a somewhat garish indication of which aid goes where. Only six ear tips, a pair in each of three sizes, are included in the box—although Ceretone also sent some clear tips on the side, which I found a bit more comfortable. All of Ceretone’s ear tips are “closed” domes, which created a moderately distorted, echolike effect in my testing. At the very least, a broader selection of ear tips, including open domes that are more appropriate for users with mild hearing loss, would help to improve audio fidelity.

Two white in-ear hearing aids, one with a blue cushion on the left and one with a red cushion on the right

Photograph: Ceretone

Echo aside, I found the Core One experience initially rocky, primarily owing to significant, screeching feedback whenever I so much as touched the aids or the recovery thread. While the amplification impact was readily apparent, the aids were hampered by this high-pitched interference. The problem was exacerbated by difficulty getting the aids seated properly in my ears. It may not look like it at first, but there is a “right side up” to these aids, as the recovery thread is meant to angle downward out of the ear canal. This was surprisingly hard to achieve, owing in part to the small size of the aids, which meant I was constantly fiddling with them.

The Core One hearing aids are not tuned to your audiogram, nor are any frequency equalization options available. Like many low-cost hearing aids, the volume boost is across the board, providing a steady but blunt amplification to all sounds in the spectrum. You’ll need the mobile app to control the aids, as there are no onboard hardware controls available (and no way to reach them anyway).

Even these controls are on the blunt side: Six volume settings and two program modes (standard and restaurant) are available in the app—and each has to be set individually for each aid. Bizarrely, there’s no indication of what the active volume or program setting is in the app. Instead, you have to tap a control button (say, “Volume up”) and listen for beeps to guess whether the audio is loud enough; three beeps mean you are either at minimum or maximum volume. The same goes for the program mode: One beep means you’re in standard mode, and two beeps mean you’re in restaurant mode. Again, visual cues that indicate the live status of these settings seem like a bare minimum to ask for, even in a budget hearing aid product.



Elehear Alpha Pro Review: Hearing Aids With Great Battery Life


Hearing aids: Not only for the near-deaf? We’ve already seen one product in the emerging category of hearing aids designed for users with relatively mild hearing loss—the Olive Union Olive Max. Now there’s Elehear’s Alpha Pro, another affordable over-the-counter product that aims to acclimate users to what hearing aids can do … before things reach crisis mode.

Elehear’s Alpha Pro doesn’t break any new ground in the design department, offering a traditional behind-the-ear design with a receiver connected to the primary device via a thin wire—perhaps just a bit longer than most. The units are available only in a dark gray color, which I find more aesthetically pleasing and unobtrusive than the more common silver or beige (yech).

Two grey over-the-ear hearing aids side by side with white cushions

Photograph: Elehear

The units arrive unconfigured, but new users get a free 30-minute online session with an audiologist if they need help setting things up and getting the lay of the land. If you’re a first-time hearing aid user, this is a good idea, as the audiologist can guide you through which settings and eartips are likely to work best for you, not to mention provide general usage and cleaning tips. The audiologist (there’s just one at Elehear) can also help later, on an ad hoc basis, via phone and email.

The Alpha Pro’s hardware controls are simple, with an individual volume rocker on the back of each unit. The aids work out of the box, without Elehear’s app, but you’ll need to delve into said app to get the most out of them. While the app is simple on the surface, there’s a lot more to it underneath. Naturally, individual volume controls dominate the main screen, alongside selections for controlling the amount of ambient noise reduction and the ability to choose between a forward-facing speech focus and a 360-degree listening mode. The Mute button here is also handy; it cuts out all amplification should you need some peace and quiet.

The Adjust tab lets you drill down further, where you’ll find four presets that correspond to various levels of hearing loss, from Mild to Moderate II. Elehear’s audiologist told me these are all tuned based on common hearing loss patterns—boosting higher frequencies more than lower ones—but you can tweak them further by tapping the Edit icon, which opens a rudimentary equalizer where you can set levels for Ocean Wave, World Sound, and Birds Chirping (i.e., lows, mids, and highs). All of the settings on this screen can be made globally or per ear. There are also four environmental modes—General, Restaurant, Outdoor, and TV—which are fairly self-explanatory. Elehear says the only real difference among them is the amount of noise reduction along with the use of the directional focus mode.

Overhead view of a hearing aid kit, including the hearing aids, cushions, case, and instructions

Photograph: Elehear



Olive Union Olive Max Hearing Aids: For Mild Hearing Loss


You don’t have to be nearly deaf to use a hearing aid. Many doctors urge patients to get started with the devices early, before hearing loss becomes critical. Olive Union’s Olive Max is the first hearing aid I’ve encountered designed for this specific purpose, built for users with “mild to moderate” hearing loss, which the company defines as 26 to 55 decibels of loss. That’s right in line with my diagnosis, so I figured I’d be a perfect candidate for these new devices.

Out of the box, you’re likely to say what I—and everyone I’ve been around—immediately said when I first laid eyes on the Olive Max: They sure are big. Like, really big. Each looks like a Bluetooth headset from the early 2000s, except you have to wear two. At least the units, in a two-tone white and gray design, look sporty, including a wrap-around ear hook that helps keep them in place. They also carry an IPX7 water-resistance rating. But at more than 12 grams each, they’re a solid four or five times the weight of a typical over-the-counter hearing aid. A total of eight different ear tips, in three different styles, are included in the kit to ensure you get a good fit.

Two white and black over-the-ear hearing aids floating side by side

Photograph: Olive Union

As hearing aids, the Olive Max units work roughly as advertised, and casual users can pop them out of the box and into their ears to get started with minimal fuss, though getting them hooked over your ear properly can be tricky, especially if you wear glasses. Controls on the back of each aid handle volume (independently for each ear) and let you select one of four environmental modes (TV, Meeting Room, Outdoor, or Restaurant). You can also use the buttons to toggle “Hear-Thru mode,” which lets you turn off environmental audio processing altogether if you simply want to use the Olive Max as Bluetooth earbuds.

You can fine-tune your listening experience in the My Olive app (Android, iOS)—though, bizarrely, the hearing aid manual does not mention that an app exists, or even that you can use the hearing aids as Bluetooth earbuds. Make sure you get My Olive, not the incompatible Olive Smart Ear app. The app allows you to make the same adjustments as the physical controls, but it also offers noise-reduction and feedback-cancellation features (pro tip: max out both), and it includes a more detailed graphic equalizer that lets you fine-tune frequency response further.

You can’t test your hearing directly within the app, although a short questionnaire will hook you up with various “AI-recommended presets” based on your age and a few other basic inputs. If you want anything more refined, you’ll need to delve into the equalizer by hand, which is mostly a trial-and-error affair. It’s also worth noting that the My Olive app includes an audio therapy system designed to help people with tinnitus. I don’t suffer from tinnitus, so I wasn’t able to test this feature.

Two over-the-ear hearing aids floating beside a mobile device with a screen showing adjustment settings for the hearing aids

Photograph: Olive Union



How Generative AI Aids in Accessibility


Close-up of a smartphone screen with a thumb hovering over the ChatGPT app icon

Tools such as ChatGPT can level the field for scientists who are English-language learners.Credit: Alamy

In 2015, Hana Kang experienced a traumatic injury that damaged the left hemisphere of her brain, disrupting her facility for language and ability to process abstract thoughts. She spent the next six years rebuilding her memory, recovering basic mathematics skills and relearning Korean, Japanese and English. In 2022, she returned to finish her bachelor’s degree in chemical biology at the University of California, Berkeley.

Today, Kang works as a junior specialist at the university’s Center for Genetically Encoded Materials. She uses mobility aids and an oxygen concentrator to manage her chronic pain — physical tools that are essential to her well-being. But no less meaningful are the generative artificial intelligence (GAI) programs she turns to each day to manage her time, interact with peers and conduct research. Kang struggles to read social cues and uses chatbots to play out hypothetical conversations. These tools also help her on days when fatigue clouds her thinking — by transcribing and summarizing recordings of lectures she attends, gauging tone and grammar, and polishing her code. “Without these tools, I’d be very lost, and I don’t think I could have done what I’ve managed to do,” she says.

Artificial intelligence (AI) tools — including chatbots such as ChatGPT, image generators such as Midjourney and DALL-E, and coding assistants such as Copilot — have arrived in force, injecting AI into everything from drafting the simplest grocery list to writing complex computer code. Academics remain divided over whether such tools can be used ethically, however, and in a rush to control them, some institutions have curtailed or completely banned the use of GAI. But for scientists who identify as disabled or neurodivergent, or for whom English is a second language, these tools can help to overcome professional hurdles that disproportionately affect marginalized members of the academic community.

“Everybody’s talking about how to regulate AI, and there’s a concern that the people deciding these guidelines aren’t thinking about under-represented individuals,” says Chrystal Starbird, a structural biologist at the University of North Carolina at Chapel Hill. She recently turned her attention to how GAI can support diversity, equity and inclusion. “We have to make sure we’re not acting from a place of fear, and that we’re considering how the whole community might use and benefit from these tools.”

Friend or foe?

Shortly after OpenAI in San Francisco, California, released ChatGPT in late 2022, primary and secondary schools around the United States started banning chatbots amid fears of plagiarism and cheating. Universities worldwide soon followed suit, including institutions in France, Australia, India, China, the United Kingdom and the United States. Ayesha Pusey, a mental-health and neurodivergence specialist at a UK disability-services organization, learnt that some of her students were facing disciplinary action for using GAI. Pusey, who identifies as autistic, dyslexic and otherwise neurodivergent, uses these programs herself and says that although they can be used to cheat, they’re also invaluable for structuring her life. “I’ve had a lot of success just budgeting my time, down to the recipes I cook for myself.”

Indeed, using chatbots as a kind of digital assistant has been game-changing for many scientists with chronic illnesses or disabilities or who identify as neurodivergent. Collectively, members of these groups have long shared experiences of being ignored (see Nature Rev. Chem. 7, 815–816; 2023) by an academic system that prioritizes efficiency — stories that are now backed by data (see go.nature.com/3vuch31).

For those who struggle with racing thoughts, it can be challenging to settle the mind when working. Tigist Tamir, a postdoctoral researcher at the Massachusetts Institute of Technology in Cambridge, has attention-deficit hyperactivity disorder, and uses chatbots — including a program called GoblinTools, developed for people who are neurodivergent — to turn that inner chatter into actionable tasks and cohesive narratives. “Whether I’m reading, writing or just making to-do lists, it’s very difficult for me to figure out what I want to say. One thing that helps is to just do a brain dump and use AI to create a boiled-down version,” she says, adding: “I feel fortunate that I’m in this era where these tools exist.”

By contrast, people including Pusey and Kang are more likely to struggle when faced with a blank page, and find chatbots useful for creating outlines for their writing tasks. Both say they sometimes feel that their writing is stilted or their narrative thread is muddled, and value the peace of mind that AI gives them by checking their work for tone and flow.

Four different AI-generated images based on the same quote from a book, describing a scene of a house with a dirt yard in the clearing of a wood

An AI-generated visualization of a woodland clearing described in the novel I Am Charlotte Simmons by Tom Wolfe.Credit: Kate Glazko generated using Midjourney

The usefulness of these tools extends beyond writing. Image generators such as OpenAI’s DALL-E allow Kate Glazko, a doctoral student in computer science at the University of Washington in Seattle, to navigate her aphantasia — the inability to visualize. When Glazko encounters a description in a book, she can enter the text into a program to create a representative image. (In February, OpenAI also announced Sora, which creates videos from text.) “Being able to read a book and see a visual output has made reading a transformative experience,” she says, adding that these programs also help people who cannot use a pencil or mouse to produce images. “It just creates a way to quickly participate in the design process.”

Levelling the field

Academia can also be a hostile place for scientists who are English-language learners. They often spend more time reading, writing and preparing English-language presentations than do those for whom English is their first language1, and they might be less inclined to attend or speak at conferences conducted in English. They are also less likely than fluent English speakers to be perceived as knowledgeable2 by colleagues, and journals are more likely to reject their papers (see Nature 620, 931; 2023).

Daishi Fujita, a chemist at Kyoto University in Japan, was educated in Japanese. Before GAI, Fujita says, “My colleagues and I would often say how we wished we could read papers in our mother tongue.” Now, they can use ChatPDF — a chatbot that answers users’ questions about the contents of a PDF file — alongside speech recognition and translation tools such as Whisper and DeepL to smooth the reading process. Particularly for literature searches or when researching unfamiliar topics, Fujita uses GAI programs to define words in unfamiliar fields and to quickly gauge whether a paper might be helpful, saving hours of work.

Generative AI can also be useful for structuring professional communications, allowing English-language learners to worry less over how their words might be perceived. María Mercedes Hincapié-Otero, a research assistant at the University of Helsinki who grew up speaking Spanish in Colombia, relies on GAI not just to structure and proof research papers, but also to draft e-mails and job applications. Passing her text through ChatGPT to check grammar and tone “helps make things a little more fair, as people like me often need to put more time and energy into producing writing at the required level”, Hincapié-Otero says. “I might ask someone to check, but if there’s no one available at the time, this becomes a great alternative.”

Similarly, Fujita has started using chatbots to help to structure and proofread his peer-review comments. Peer review is already more laborious for scientists who are English-language learners, Fujita says, but because of the small size of his field, there’s also the risk that he could be identified by his writing style. “As a native speaker, you can feel when a comment is written by a non-native speaker,” he explains.

Towards a better world

As much as GAI has been a boon for accessibility, it can also perpetuate existing biases. Most chatbots are trained on text from the Internet, which is predominantly written by white, neurotypical men, and chatbot outputs mirror that language. Kieran Rose, an autism advocate based in the United Kingdom, says that for this reason, he never uses AI to change his style of writing. “I absolutely see the usefulness of AI,” he says, but “I don’t apologize for how I communicate”.

Jennifer Mankoff, a computer scientist at the University of Washington, together with Glazko and other researchers, investigated the potential risks in a 2023 study3 in which scientists with disabilities or chronic illnesses tested GAI tools. Mankoff, who has Lyme disease and often experiences fatigue and brain fog, says that chatbots have proved helpful for tackling tedious tasks, such as collating a bibliography. But she and her co-authors also flagged instances in which chatbots returned ableist tropes, such as ChatGPT misrepresenting the findings of a paper to suggest that researchers speak only to caregivers and not to those receiving care. One co-author struggled to generate accurate images of people with disabilities: the results included disembodied hands and prosthetic legs. And although GAI programs can parrot rules for creating accessible imagery — such as providing the best colours for graphics that can be read by people with visual impairments — they often cannot apply them when creating content.

Claire Malone sitting at her home computer

Claire Malone uses AI for dictation.Credit: Claire Malone

That said, GAI can also bring joy to people’s lives. Speaking to Nature, scientists shared stories of using the software to create knitting patterns, recipes, poetry and art. That might seem irrelevant to academic research, but creativity is a crucial part of innovation, Mankoff says. “Particularly for creative tasks — ideation, exploration, creating throwaway things as part of the creative process — accessibility tools don’t have all of the capabilities we would want,” she says. “But GAI really opens the door for people with disabilities to engage in this space where interesting advancements happen.”

Claire Malone, a physicist turned science communicator based in London, is working on a science-fiction novel and uses AI to transcribe her thoughts through dictation — something she couldn’t do even a year ago. Malone has mobility, dexterity and speech conditions because of cerebral palsy, but in 2022, she discovered an AI tool called Voiceitt that transcribes atypical speech and integrates with ChatGPT. Whereas before she could type at six words per minute, “if I dictate, I can write at the pace that I speak”, she says, adding that the tool has been “transformative” in her work and personal life. In a LinkedIn post (see go.nature.com/3ixrynv), Malone shared how she can now get away from her desk and dictate text whenever inspiration strikes.

As for Kang, she’s started using GAI to re-engage with her creative and social outlets. Before her accident, Kang often wrote fiction and graphic novels, and she has started to do so again using ChatGPT and image generators. She’s also rebuilding her social life by hosting house parties and using ChatGPT to generate conversation topics and even jokes. Using chatbots to inject humour back into her relationships has helped her to reconnect with friends and break the ice with strangers, she says. “Humour feels like such an unimportant thing when you’re trying to rebuild a life, but if you can afford to be funny, it feels like you’ve succeeded.”



Jabra Enhance Select 300 Hearing Aids Review: Some of the Best We’ve Tested


I’ve been covering hearing aids for WIRED for nearly three years now, and I regularly talk to users and prospective buyers about them when I wear them in public. Regardless of what I’m testing, one brand name has popped up consistently during that time: Jabra.

The Danish brand has a long history of making a variety of audio gear, but I’ve always associated it mostly with the Bluetooth headset craze of the aughts. The brand made an early entrance into the over-the-counter hearing aid market (via an acquisition), and it hasn’t let up since, releasing new OTC models at a steady clip.

The latest of these is the Jabra Enhance Select 300, the brand’s smallest and most advanced model yet. You wouldn’t really know it just from the look of the aids. These are fairly standard behind-the-ear models that, while quite small (2.64 grams each), don’t offer any obvious surprises. The demure gray chassis sits close to the back of the ear and snakes a silver cable to the ear canal. Each aid carries a single button on its reverse.

Grey rectangular case holding silver hearing aids with a one hand pulling a hearing aid out

Photograph: Jabra Enhance

Jabra front-loads a lot of the purchase process to ensure your aids arrive preconfigured. You can take an online hearing test or, as I did, upload a professional audiogram; either option allows Jabra’s audiologists to tune the product appropriately before it is shipped. The company also asks you to take a lengthy medical questionnaire to rule out any hearing-related medical problems before sending out the product. Eventually, the digital chatter can get a little tiresome: During the shopping process, Jabra even asks about your credit rating and suggests a monthly payment plan for its lowest-priced product if you say your credit is trash. Once you do place an order, Jabra barrages you with introductory emails and invites you to schedule an orientation with an audiologist to walk you through the hardware and the app. Admittedly, some of this is helpful—especially the Zoom orientation—but Jabra could stand to pump the brakes on the auto-mailer a bit.

There’s plenty to explore once your hearing aids arrive. For example, if you aren’t sure which type of ear tips are best for you, you’ll have ample room to experiment, because the company sends seven different baggies of them to try out, including open, closed, and tulip-style tips in a multitude of sizes. I counted 70 different tips in total, and I have no doubt that Jabra would happily send more if I asked.

With tips installed (I usually test with open tips), I found that getting the aids situated on my ears was made a bit easier thanks to a pinging sound that plays—Jabra calls it Smart Start—while you are guiding the receivers into your ear canal. Controls are as basic as they come: the button on the right aid turns the volume up for both aids, the one on the left turns volume down, and either one cycles through the programs—four in total—if you hold it down for a couple of seconds.

Naturally, you’ll get a lot more out of the hearing aids if you connect your set to a mobile app, and Jabra actually has two apps to choose from. The Enhance Pro app comes up first in the app store, but the Enhance Select app is newer, so I’ll write mostly about it; the two work about the same way. Primarily you’ll use the app to move among the four modes—All Around, Restaurant, Music, and Outdoor—all of which are self-explanatory. Each mode has extra options associated with it; for most, you can select between “noise filter” to mute ambient sounds or “speech clarity” to boost conversational volume. These can be further customized thanks to three equalizer sliders corresponding to bass, middle, and treble frequencies. Volume can be set globally or individually per ear in the app as well. Of special note: Any customizations you make to programs aside from the All Around mode are reset to defaults once the hearing aids are put back into the charging case.



How to use AirPods as hearing aids


AirPods can work surprisingly well as hearing aids. Thanks to Transparency mode, AirPods Pro will boost the sound of the environment around you. They can give you freaky Spider-Man super-hearing if you boost input volume to the max. And Conversation Boost on the latest AirPods Pro 2 can intelligently raise the volume of people talking to you (while lowering the volume of your music or podcasts).

My prescription hearing aids were out of action recently due to a battery problem, so for a few weeks, I used my AirPods Pro as hearing aids.

For the most part, they’ve been pretty good. They work best when set up properly, which is a bit of a chore, but here’s how to do it.

This post contains affiliate links. Cult of Mac may earn a commission when you use our links to buy items.

How to use AirPods as hearing aids

If you experience mild to moderate hearing loss, you can use AirPods Pro as great part-time hearing aids, boosting sound in noisy environments. They also work well as a substitute when you forget your regular hearing aids (or their batteries run out of charge).

Apple’s noise-canceling earbuds also can serve as a great starter device. Most prescription hearing aids cost between $3,000 and $10,000, and require professional hearing tests, fitting and adjustment. Before diving in, you might want to experiment with boosting your hearing with a pair of AirPods Pro.

There’s also no social stigma against wearing AirPods as there is with wearing hearing aids. No one will know you’re using them to augment your hearing — although you might appear rude at first keeping them in while you talk.

They can give you superhuman hearing capable of picking up whispered conversations across the room, too. Just boost the input volume to the maximum. Note: Be very careful listening to anything at high volumes; even a few minutes at loud volume can damage your ears further.


AirPods Pro (2nd generation) with USB-C charging case

The latest model AirPods Pro 2 are among the best wireless earbuds you can buy for your iPhone and other Apple gear. They offer excellent sound quality, battery life and active noise cancellation, among other advanced features.


Buy Now

We earn a commission if you make a purchase, at no additional cost to you.

03/13/2024 02:09 pm GMT

Researchers say AirPods are already as good as hearing aids

Professional hearing aids are super-expensive, but researchers found that AirPods already work nearly as well as most hearing aids. In 2020, scientists pitted a pair of AirPods 2 and first-gen AirPods Pro against a basic hearing aid and a premium pair. In almost all the tests, they found the AirPods Pro were as good as the basic hearing aid and only slightly less effective than the premium pair. Even the AirPods 2 “helped participants hear more clearly compared with wearing no hearing aids,” the study found.

Apple doesn’t yet have approval from the Food and Drug Administration to sell AirPods as OTC hearing aids and hasn’t made any announcements about this possibility, but it’s rumored to come soon. The company has been adding a lot of features to AirPods Pro that make them suitable — including Conversation Boost, muting background sounds, and particularly Transparency Mode, which makes all this possible.

AirPods Pro work great as part-time or starter hearing aids, but they’re not a full-time replacement yet.

Apple’s earbuds still don’t work as well as my Oticon More hearing aids: The battery life is much shorter (AirPods Pro get about five to six hours versus 18 to 24 hours for the Oticons), and they aren’t as good at boosting voices.

I’ve been using Apple’s latest AirPods Pro (second generation), which Apple launched in September 2022, as substitute hearing aids.

You may not know you need hearing aids

Most people don’t know they are suffering from hearing loss. Studies show that only 20% of people with hearing problems wear a hearing aid.

I was in this boat myself. I had trouble hearing people speak in loud environments, but I had no idea of the extent of my deafness until I got fitted with hearing aids. When I first tried them on, I was astounded and almost choked up. Suddenly, I could hear the leaves rustling in the trees and sand crunching underfoot as people passed by: sounds I was oblivious to beforehand. It’s been an amazing experience getting my hearing back.

If you own AirPods Pro, perhaps you should try setting them up as hearing aids to see if they make any difference. If so, you can then explore getting prescription hearing devices.

Start with an audiogram

This is my audiogram after taking a professional hearing test. It shows moderate hearing loss in higher frequencies. The test tones had to be played louder in order for me to hear them. (Ignore the coffee stains!)

Before setting up AirPods as hearing aids, I highly recommend getting an audiogram, a chart that shows the results of a hearing test. It shows how loud sounds need to be for you to hear them at different frequencies — low-pitched through high-pitched. As you can see in the audiogram above, I have moderate hearing loss in higher frequencies: The test tones had to be played at much higher decibels in order for me to hear them.

If you import an audiogram, your AirPods will create an individualized sound profile for you, boosting the frequencies you have trouble hearing. It makes a huge difference.
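Apple doesn’t publish how it converts an audiogram into a sound profile, but the general idea is easy to illustrate. A classic audiology rule of thumb, the half-gain rule, prescribes amplification of roughly half the measured hearing loss at each frequency:

```python
# Illustration only: Apple does not publish how it maps an audiogram to
# amplification. The "half-gain rule" below is a classic audiology rule
# of thumb, used here just to make the frequency-shaped boost concrete.

# Thresholds in dB HL per frequency (Hz): a made-up audiogram with
# high-frequency loss, similar in shape to the one pictured above.
audiogram = {250: 10, 500: 15, 1000: 20, 2000: 35, 4000: 50, 8000: 55}

def half_gain(thresholds):
    """Prescribe gain of roughly half the hearing loss at each frequency."""
    return {freq: round(loss / 2) for freq, loss in thresholds.items()}

gains = half_gain(audiogram)
print(gains)  # small boost at low frequencies, much larger boost up high
```

The point of the individualized profile is exactly this shape: someone with high-frequency loss gets far more gain at 4 kHz than at 250 Hz, rather than a flat volume increase.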

If you’ve taken a professional hearing test, you’ll likely have been given an audiogram — usually, it’s just a printout. Simply take a picture of your audiogram with your iPhone and it’ll be ready to import.

Luckily, you don’t have to take a professional hearing test: Some free apps will create very accurate audiograms for you. (Jump to the section below detailing how to get a free audiogram using an app.)

However, you don’t need an audiogram to use AirPods as hearing aids. There’s a built-in hearing test during the setup process that gets pretty close.

AirPods as hearing aids: Step-by-step setup instructions

Here are the steps to transform AirPods into assistive hearing devices, with screenshots of the whole process:

1. Open Settings on your iPhone or iPad, then Accessibility, then AirPods

Start in the Settings app, then tap Accessibility.
Screenshot: Leander Kahney/Cult of Mac

First, connect your AirPods to your iPhone. (This works the same way on an iPad.) Then open the Settings app, tap Accessibility, and scroll down to AirPods. Tap that to select the AirPods you want to customize.

2. Scroll down to Audio Accessibility Settings

Choose the AirPods you want to set up. Make sure they are connected, or you might get an error.
Screenshot: Leander Kahney/Cult of Mac

Make sure you are wearing your AirPods, then tap on them if your device says they are Connected. On the next screen, scroll down to Audio Accessibility Settings.

3. Tap Headphone Accommodations, then Custom Audio Setup

Under Headphone Accommodations, you will find the Custom Audio Setup section.
Screenshot: Leander Kahney/Cult of Mac

At the top of the screen, tap Headphone Accommodations, then tap Custom Audio Setup. The next screen asks you to add your audiogram.

4. Import your audiogram

To add an audiogram, just take a picture of it. The camera automatically reads the numbers in the chart.
Screenshot: Leander Kahney/Cult of Mac

If you have an audiogram printout, hit Continue in the setup wizard. The first time through, you’ll be asked to add your audiogram. Choose Add Audiogram, then import the chart using the iPhone’s Camera, Photos or Files app. I took a picture of the audiogram from my professional hearing test, and the camera imported the results automatically.

After importing, you’ll be asked to fill in any missing numbers.

5. Customize your AirPods Pro’s Transparency Mode

If any of the audiogram data turns up missing during import, you’ll be asked to fill it in.
Screenshot: Leander Kahney/Cult of Mac

After hitting Save, you can adjust amplification (basically the volume level of the AirPods you’ll be using as hearing aids), left and right balance, and tone. You can turn on Ambient Noise Reduction and Conversation Boost, both of which should make it easier to hear people in face-to-face conversations. I recommend turning on both, and went with the default amplification. (I crank it up for Spider-Man-like super-hearing as needed.)

Hit Done.

6. If you don’t have an audiogram, take the built-in hearing test (or use a free app)

Apple’s hearing test asks you to rate samples of speech and music. It’s basic, but good for a start.
Screenshot: Leander Kahney/Cult of Mac

If you don’t have an audiogram, you can take Apple’s built-in hearing test to generate one. First, make sure you’re in a quiet environment. The test asks you to compare samples of music and voices, asking which sounds better — version 1 or version 2. If they sound about the same, choose the first one. The test takes just a couple of minutes.

Alternatively, you can use one of the free apps mentioned below to generate an audiogram.

7. After taking the hearing test, make sure to choose Custom Settings

Choose Custom Settings.
Screenshot: Leander Kahney/Cult of Mac

At the end of the built-in hearing test, make sure to choose Custom Settings.

8. If you have an audiogram already saved, choose it when customizing Headphone audio

If you already have an audiogram saved in the Health app, it’ll appear here.
Screenshot: Leander Kahney/Cult of Mac

You may already have an audiogram saved in the Health app. If so, it’ll appear on the first screen after hitting Custom Audio Setup in step 3 above.

Choose the latest audiogram, and hit Use Audiogram. You’ll then be able to customize settings, as in Step 6 above.

How to get an audiogram using a free app

The free Mimi hearing test app can create an audiogram to be imported into the Health app.
Screenshot: Leander Kahney/Cult of Mac

If you don’t have an audiogram, you can use the Mimi Hearing Test or SonicCloud Personalized Sound apps. After taking the free hearing test, simply import the results into the Health app.

Both these apps mimic a professional hearing test. You listen to a range of tones at various frequencies and volumes, and the app creates an audiogram based on your responses.
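The standard clinical version of this procedure is the Hughson-Westlake staircase: drop the tone 10 dB after each response, raise it 5 dB after each miss, and call the threshold the lowest level the listener confirms twice. Whether Mimi or SonicCloud use exactly this method isn’t documented, so treat the sketch below as a generic illustration:

```python
def find_threshold(hears, start=40, floor=-10, ceiling=100):
    """Simplified Hughson-Westlake staircase for one test frequency:
    drop 10 dB after each response, rise 5 dB after each miss; the
    threshold is the level the listener confirms twice."""
    level, confirmations = start, {}
    while floor <= level <= ceiling:
        if hears(level):
            confirmations[level] = confirmations.get(level, 0) + 1
            if confirmations[level] >= 2:
                return level
            level -= 10
        else:
            level += 5
    return ceiling

# Simulated listener with a true threshold of 35 dB HL at one frequency.
print(find_threshold(lambda level: level >= 35))  # → 35
```

Run once per test frequency, this produces exactly the threshold-vs-frequency curve an audiogram plots.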

I tried both apps and got results close to my professional audiogram, which makes me confident they are both pretty accurate. The numbers from both apps were slightly off, however. (That could indicate my hearing has deteriorated since my last professional hearing test.)

Obviously, if you want the greatest accuracy, head to your doctor or hearing specialist.

After taking the hearing test in either app, you’ll be asked to connect to Apple Health. Since this is private health information, you must permit the third-party apps to connect with the Health app. Hit Give Permission.

Your audiogram will be imported and will show up in the custom audio process.

Which AirPods offer Transparency Mode?

Using AirPods as hearing aids relies on Transparency Mode, which lets in outside sounds so you can hear what’s happening around you. Transparency Mode is often used when listening to music, but if you use your AirPods in Transparency Mode without any media playing, they act as hearing aids.

Transparency Mode is available on the following AirPods and Beats earbuds:

  • AirPods Pro (first and second generation)
  • AirPods Max
  • Beats Studio Buds
  • Beats Studio Buds +
  • Beats Fit Pro


AirPods Pro (2nd generation) with USB-C charging case

The latest model AirPods Pro 2 are among the best wireless earbuds you can buy for your iPhone and other Apple gear. They offer excellent sound quality, battery life and active noise cancellation, among other advanced features.


Buy Now

We earn a commission if you make a purchase, at no additional cost to you.

03/13/2024 02:09 pm GMT



[ad_2]

Source Article Link

Categories
News

Global Warming Hurts AIDS, TB, and Malaria Combat

Climate change and violence are hampering efforts to fight AIDS, tuberculosis, and malaria, according to the head of the Global Fund to Fight AIDS, Tuberculosis, and Malaria.

According to the Fund’s 2023 results report, which was released on Monday, worldwide efforts to combat the diseases have recovered significantly after being severely hampered by the COVID-19 outbreak.

The Global Fund’s executive director, Peter Sands, has warned that unless “extraordinary steps” are taken, the world will almost certainly fail to eliminate AIDS, tuberculosis, and malaria by 2030.

He also pointed to some gains. In the countries where the Global Fund invests, for example, a record-breaking 6.7 million people got TB treatment in 2022, a 1.4 million increase from the previous year. In addition, the Fund provided 220 million mosquito nets and supported the treatment of 24.5 million HIV/AIDS patients.

However, the Fund noted in a statement issued with the results that climate change contributed to the difficulties of recovering from the outbreak.

The mosquito that transmits the malaria parasite, for example, may now thrive in cooler parts of Africa’s highlands. Natural catastrophes, such as floods, are placing a burden on healthcare systems throughout the globe, causing people to migrate, spreading illness, and interrupting treatment plans, according to the report. It also said that insecurity makes reaching vulnerable people in countries such as Sudan, Ukraine, Afghanistan, and Myanmar very challenging.

But, according to Sands, there is still reason to be optimistic, owing in part to cutting-edge diagnostic and preventive measures. This week, the United Nations General Assembly will have a high-level debate on TB, giving advocates hope for more attention to the disease.


AI gives clinicians a ‘sense of urgency’ in predicting patients’ risk of dying, research shows

Researchers at OSF HealthCare want to make sure that patients have “important conversations” about their plans for the end of their lives.
Only 22% of Americans write down their end-of-life plans, according to research. A team at OSF HealthCare in Illinois is using artificial intelligence to help doctors figure out which patients are more likely to die during their hospital stay.

A news release from OSF says that the team made an AI model that can predict a patient’s risk of dying between five and 90 days after being admitted to the hospital.

The goal is for the doctors to be able to talk to these people about important end-of-life issues.

In an interview with Fox News Digital, lead study author Dr. Jonathan Handler, an OSF HealthCare senior fellow of innovation, said, “It’s a goal of our organization that every single patient we serve would have their discussions about advanced care planning written down so that we could give them the care they want, especially at a sensitive time like the end of their life when they may not be able to talk to us because of their medical condition.”

If a patient is asleep or on a respirator, for example, it may be too late for them to tell their doctors what they want.
Handler said that in an ideal world, the mortality prediction would keep patients from dying before they got the full benefits of the hospice care they could have gotten if their goals had been written down sooner.

Since the average length of a hospital stay is four days, the researchers decided to start the model at five days and end it at 90 days to give a “sense of urgency,” as one researcher put it.

The AI model was tried on a set of data from more than 75,000 people of different races, cultures, genders, and social backgrounds.

The study, which was just released in the Journal of Medical Systems, showed that the death rate for all patients was 1 in 12.

But for people who the AI model said were more likely to die while they were in the hospital, the death rate went up to one in four, three times the overall rate.
The model was tried before and during the COVID-19 pandemic, and the results were almost the same, according to the study team.
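Those headline rates work out as a quick back-of-the-envelope calculation (the cohort size is from the study; the derived figures simply follow from the reported rates):

```python
patients = 75_000          # cohort size reported in the study
overall_rate = 1 / 12      # death rate across all patients
flagged_rate = 1 / 4       # death rate among patients the model flagged

print(f"overall deaths: about {patients * overall_rate:.0f}")
print(f"relative risk for flagged patients: {flagged_rate / overall_rate:.0f}x")
```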

Handler said that the mortality predictor was trained on 13 different kinds of patient information.

“That included clinical trends, like how well a patient’s organs are working, as well as how often and how intensely they’ve had to go to the health care system and other information, like their age,” he said.
Handler said that the model gives a doctor a probability, or “confidence level,” as well as an account of why the patient has a higher-than-normal chance of dying.

“At the end of the day, the AI takes a lot of information that would take a clinician a long time to gather, analyze, and summarize on their own, and then presents that information along with the prediction to allow the clinician to make a decision,” he said.
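OSF hasn’t published its model’s internals, but the pattern Handler describes, a probability plus the factors driving it, is a standard one. Here is a hypothetical sketch with made-up logistic-regression weights:

```python
import math

# Hypothetical sketch: OSF has not published its model. This shows the
# general pattern Handler describes, i.e. a probability ("confidence
# level") plus the factors pushing a patient's risk up.

# Made-up weights over a few of the "13 kinds" of patient inputs.
WEIGHTS = {"organ_dysfunction": 1.6, "age_over_80": 1.1,
           "admissions_last_year": 0.4}
BIAS = -3.5

def predict_with_explanation(patient):
    contributions = {k: WEIGHTS[k] * v for k, v in patient.items()}
    score = BIAS + sum(contributions.values())
    probability = 1 / (1 + math.exp(-score))   # logistic link
    # Rank the inputs by how much each one raised the risk score.
    drivers = sorted(contributions, key=contributions.get, reverse=True)
    return probability, drivers

prob, drivers = predict_with_explanation(
    {"organ_dysfunction": 1, "age_over_80": 1, "admissions_last_year": 3})
print(f"risk of death in 5-90 days: {prob:.0%}, driven by: {drivers}")
```

The ranked contributions are what lets the clinician see not just the number but the story behind it, which is the summarization work Handler says the AI takes off the clinician’s plate.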
Handler said that a similar AI model made at NYU Langone gave the OSF researchers an idea of what they could do.

“They had made a death predictor for the first 60 days, which we tried to copy,” he said.

“We think our population is very different from theirs, so we used a different kind of predictor to get the results we wanted, and we were successful.”

“Then, the AI uses this information to figure out how likely it is that the patient will die in the next five to ninety days.”

The forecast “isn’t perfect,” Handler said. Just because it shows a higher risk of death doesn’t mean it will happen.

“But at the end of the day, the goal is to get the clinician to talk, even if the predictor is wrong,” he said.
“In the end, we want to do what the patient wants and give them the care they need at the end of life,” Handler said.
OSF is already using the AI tool because, as Handler said, the health care system “tried to integrate it as smoothly as possible into the clinicians’ workflow in a way that helps them.”

Handler said, “We are now in the process of optimizing the tool to make sure it has the most impact and helps patients and clinicians have a deep, meaningful, and thoughtful conversation.”

Expert on AI points out possible limits

Dr. Harvey Castro, a board-certified emergency medicine doctor in Dallas, Texas, and a national speaker on AI in health care, said that OSF’s model may have some benefits, but it may also have some risks and limits.

One possible limitation is false positives. “If the AI model wrongly predicts that a patient is at a high risk of dying when they are not, it could cause the patient and their family needless stress,” Castro said.
Castro also brought up the risk of false negatives.

“If the AI model doesn’t find a patient who is at high risk of dying, important conversations about end-of-life care might be put off or never happen,” he said. “If this happens, the patient might not get the care they would have wanted in their last days.”

Castro said that other possible risks include over-reliance on AI, data privacy concerns, and the possibility of bias if the model is built on an unrepresentative set of data, which could lead to different care advice for different patient groups.

The expert said that these kinds of models should be used with human contact.

“End-of-life conversations are difficult and can have big effects on a patient’s mind,” he said. “People who work in health care should use AI predictions along with a human touch.”

The expert said that these models need to be constantly checked and given feedback to make sure they are still accurate and useful in the real world.

“It is very important to study AI’s role in health care from an ethical point of view, especially when making predictions about life and death.”