Latest news with #accessibility


CNN
4 hours ago
- Politics
- CNN
Judge weighs push to require ASL interpreters at White House briefings
A federal judge grappled for over an hour on Wednesday with an effort to force the Trump administration to provide American Sign Language interpreters at White House press briefings. The case, brought by the National Association of the Deaf, alleges that by failing to provide sufficient ASL interpretation, the White House is violating deaf Americans' rights under the Rehabilitation Act of 1973 and cutting them off from 'critical information in real time.'

US District Judge Amir Ali, one of former President Joe Biden's final appointees, did not immediately issue a ruling, but he appeared sympathetic to the group's arguments. Without live ASL interpretations readily available at White House briefings, NAD attorney Ian Hoffman argued, deaf Americans are 'deprived of their ability to participate in the democratic process.'

The Biden administration had staffed all of its press briefings with qualified ASL interpreters, but the Trump White House discontinued that policy earlier this year. In court on Wednesday, Justice Department attorney Hedges argued that the accessibility services the administration currently offers, including live closed captions and written transcripts, are sufficient to give the deaf community 'meaningful access' to White House information.

In legal briefs, the NAD had pushed back on this argument, asserting that ASL and English are distinct languages and that closed captioning is 'especially inaccessible to the many thousands of deaf persons fluent only in ASL.'

Ali pressed Hedges about the utility of written transcriptions. 'How does it help to point to things that may not be adequate?' he said, asking why the DOJ hadn't presented evidence showing that written means were sufficient to inform the deaf community. Hedges responded that the burden was on the plaintiffs to show that more thorough ASL translations were necessary, and repeated her earlier claim that the type of services provided should be at the discretion of the White House.

The National Association of the Deaf also took aim at the first Trump administration in 2020 for its failure to provide ASL interpretation during important Covid-19 briefings. In that suit, a federal judge ordered the White House to provide in-frame videos of ASL interpreters during televised press events. In his ruling, US District Judge James Boasberg specifically clarified that written means such as transcripts and closed captions, the methods emphasized by the DOJ, 'may constitute a reasonable accommodation under some circumstances, but not here.'

After Boasberg's order, the first Trump White House began providing ASL interpreters for all pandemic-related press events. When Biden took office in 2021, his administration expanded accessibility programs and began staffing all press briefings with ASL interpreters. But on the first day of his second administration, Trump halted the use of all ASL interpreters at White House briefings, prompting the lawsuit filed in May.

The courtroom on Wednesday was flooded with members of the deaf community showing their support for the plaintiffs. ASL interpreters provided live translations throughout the nearly 90-minute hearing.


CTV News
8 hours ago
- General
- CTV News
Waterslide at Winnipeg community centre wading pool removed
A waterslide at a Winnipeg community centre has been removed. In a statement, the City of Winnipeg said it recently completed a project at the Dakota Community Centre's waterplay park that involved repairs to the wading pool and removal of the waterslide due to accessibility issues.

The city said it knows the removal will be disappointing to some people but noted that new spray features have been added that will provide 'new water play opportunities and foster an atmosphere of inclusion.' According to the city's website, wading pools will open for the season on a staggered basis beginning on July 1. CTV News has reached out to the city to find out about the new features at the wading pool.


WIRED
11 hours ago
- WIRED
These Transcribing Eyeglasses Put Subtitles on the World
TranscribeGlass can subtitle conversations in nearly real time and will soon be able to translate languages and tell you when the person you're talking to is feeling socially awkward.

I knew the AI on these smart glasses worked pretty well once it told me that someone else in the conversation was being the socially awkward one. TranscribeGlass is a pair of smart glasses that aims to do exactly what it says on the tin: transcribe spoken conversations and project subtitles onto the glass in front of your eyes. They're meant for the Deaf and, primarily, the hard-of-hearing community, who struggle to read lips or pick out a conversation in a loud room.

Most face computers are graceless and heavy, but these glasses are light at only 36 grams. TranscribeGlass keeps the weight down by relegating most of the computing to a companion app (iOS only for now). There are no cameras, microphones, or speakers in the frames, just a small waveguide projector in the rim over one eye that beams a 640 x 480 image onto the glass. That is just enough resolution for text to be legible when it is projected directly into your vision, subtitling the conversations picked up by the mic in your phone.

In the app, subtitles can be moved around in the wearer's vision, anywhere within a 30-degree field of view. You can change the settings to adjust how many lines of text come in at a time, dialing up to a wall of text and down to one word at a time. The battery in the glasses should last around eight hours between charges. The frames cost around $377, and there's an additional $20-per-month subscription fee to access the transcription service.

Subtitles are currently available in the glasses, but Madhav Lavakare, the 24-year-old founder of TranscribeGlass, has other features lined up. In the testing phase are a setting to translate languages in real time and one to analyze the tone of voice of the person talking.

Glass Dismissed

As Lavakare told me (and The New Yorker in April), he envisioned this product after wanting to help a hard-of-hearing friend engage in conversations that were not happening with his needs in mind. Lavakare, who is a senior at Yale University, figured glasses were the way to go. If he could just get them right. And, you know, make them look cooler than some other glasses out there.

'I was pretty obsessed with Google Glass when it came out,' Lavakare says.

'Oh,' I say. 'So you were a Glasshole?'

'I was, I was!' he says with a laugh. 'And then I was like, why are people calling me that?'

While we are talking, the words pop up onto the screen of the glasses I'm wearing. They show up in a Matrix-y green font that patters out across my vision. It does a pretty good job of transcribing the conversation, though it does split the word 'Glasshole' into 'Glass Hole,' which is honestly funnier.

Though Lavakare's smart glasses are much more normal-glasses-adjacent than Google Glass ever was, they still can't really help but look like smart glasses. The screen has a slight shimmer where the waveguides sit on the glass that is just visible enough to onlookers and clearly noticeable to me when I am wearing them.

Aside from those minor gripes, the service itself works almost eerily well. At a bustling coworking space in San Francisco with many conversations happening around us, Lavakare and Nirbhay Narang, Transcribe's CTO, talked to me while I wore the glasses.
Most of the transcriptions were grammatically correct and were labeled with different speaker titles to make it clear who was talking. It all works so fast, in fact, that I had trouble keeping up as the conversation went on and new lines of text appeared almost simultaneously. The transcriptions are also sometimes a little grainy and hard to focus on in the moment. Still, with a little practice, it's hard not to see how this would be extremely useful for people who are hard of hearing.

TranscribeGlass has a few competitors. Companies like Even Realities and XRAI make glasses that look flashier and offer more features, like turn-by-turn directions and chatbot interaction. But Lavakare says the limited functionality is what makes his spectacles special.

'All these smart glasses exist, but no one's found a great use case for them,' Lavakare says. 'We think we've really found a use case that's just insanely valuable to the end user.'

While these glasses can't play music or use AI to answer questions, he says, they only really need to do one thing well to get people to wear them: help them understand what is being said around them. Lavakare likens that feeling of missing out on a conversation happening around you to a kind of social isolation.

That said, he does hope to pack other conversational features into the glasses, with the goal of enhancing what you can glean from the subtext of a chat. One upcoming feature is language translation. Narang and I have a short conversation to test the translation abilities. He speaks to me in Hindi while I speak to him in English. On my glasses, I see whatever he says translated into English; when I respond in English, the Hindi text pops up on his phone app. It's a service that also seems to work well enough, though some words are mistranslated. That's why the feature hasn't yet rolled out to the few hundred customers TranscribeGlass has now.

More Features to Come

There are other features in the works. Lavakare wants to give users the option of translating spoken language into something closer to the syntax of a visual language such as American Sign Language, which tends to order nouns, verbs, and tenses differently than spoken English. Trusting that translation to AI, when most Deaf people can and do already read English just fine, could cause some inaccuracies or misinterpretations. Lavakare acknowledges that potential for error, noting that he has talked with Deaf educators at the American School for the Deaf to try to get it right.

'Sign language grammar is actually very different than English grammar,' Lavakare says. 'That's why this is still experimental.'

He's also testing an even more dubious capability: recognizing the emotion of a speaker based on tone of voice alone. Emotion tracking is a fraught topic in the AI space, albeit one people just can't seem to help putting into smart glasses. While TranscribeGlass hasn't released the ability to catalog emotions during a conversation, the team is testing it with the goal of releasing it soon. It makes sense for helping with conversational comprehension, given that detecting how a person says something is often as important as knowing what they say.

Lavakare lets me test it out, switching on the feature while I'm wearing the glasses. 'Watch this,' he says. Then, 'Hey Boone, how's it going?' His words pop up on the screen.
I start to answer, and then a dialog tag appears with the emotion attached to his words: [Awkwardness]. I laugh and say, 'Oh no, are we that awkward?' Then the tag pops up on my words: [Amused]. Now my words have my name next to them, which the platform had picked up when Lavakare said it earlier. As soon as I finish talking, it changes my dialog tag to [Awkwardness]. Well. Maybe this thing does work.
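TranscribeGlass hasn't published its internals, but the on-screen behavior described above (a speaker label on each caption line, plus an optional emotion tag) maps naturally onto a small data model. Here is a minimal, purely illustrative sketch in TypeScript; the CaptionLine interface and renderCaption function are assumptions made for illustration, not the company's actual code.

```typescript
// Purely illustrative: TranscribeGlass has not published its internals.
// A caption line carries a speaker label, the transcribed text, and an
// optional experimental emotion tag, matching what appears on screen.

interface CaptionLine {
  speaker: string;   // speaker label; becomes a name once the system learns it
  text: string;      // transcribed speech
  emotion?: string;  // tone-of-voice tag, e.g. "Awkwardness"; may be absent
}

// Format a line the way it appears in the glasses: name, text, then tag.
function renderCaption(line: CaptionLine): string {
  const tag = line.emotion ? ` [${line.emotion}]` : "";
  return `${line.speaker}: ${line.text}${tag}`;
}

const demo: CaptionLine[] = [
  { speaker: "Madhav", text: "Hey Boone, how's it going?", emotion: "Awkwardness" },
  { speaker: "Boone", text: "Oh no, are we that awkward?", emotion: "Amused" },
];

for (const line of demo) {
  console.log(renderCaption(line));
}
// Madhav: Hey Boone, how's it going? [Awkwardness]
// Boone: Oh no, are we that awkward? [Amused]
```

Keeping the emotion field optional reflects that the tagging feature is still experimental and can simply be absent from a line.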


GSM Arena
12 hours ago
- GSM Arena
GSMArena labs: the viewer for 3D phone models is now easier to control, accessibility improved
Five years ago we partnered with Binkies 3D to bring you 3D models of the most popular smartphones, and a couple of years ago the 3D viewer gained a side-by-side comparison mode that lets you size up different models. The latest update improves the usability and accessibility of the viewer.

After you spin the 3D model of a phone, it will now snap to 45° angles horizontally and 90° angles vertically, so you can easily 'pose' the phone with a quick swipe. You can still view the phone at any angle while you move your finger or mouse cursor; the snapping only occurs once you let go (a rough sketch of this snap-on-release behavior follows at the end of this article).

The ways you interact with the 3D model were improved too. You can use the keyboard to control the interface. And if you are on mobile, you can swipe across the phone to spin it, while up/down swipes scroll the page, so you can't get stuck.

The Binkies team also implemented a number of changes to improve accessibility. All visual elements have ARIA labels, which are used by screen-reading software. Additionally, colors were tweaked to ensure they have good contrast for visibility, and the slightly transparent background of the full-screen view is now opaque, again to boost visibility. The 3D viewer is now compliant with the European Accessibility Act.

You can find 3D views of many popular smartphones in our database by clicking on Pictures on their specs pages. There you will see both official images and our own photos, plus the Binkies 3D viewer. The 3D models for each phone are available in several colors too, to help you pick out your favorite. If you want to see two phones side-by-side, there is a 'Size Up' button that shows up when comparing different models.
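Binkies 3D hasn't published the viewer's source, so the following TypeScript is only a sketch of the snap-on-release behavior described above, under stated assumptions: the Orientation shape, the function names, and nearest-multiple rounding are all hypothetical. It shows free rotation while dragging, with yaw snapping to 45° steps and pitch to 90° steps only once the user lets go.

```typescript
// Purely illustrative: the Binkies 3D viewer's source is not public.
// While dragging, the model follows the pointer freely; on release,
// yaw snaps to the nearest 45° step and pitch to the nearest 90° step.

interface Orientation {
  yaw: number;   // horizontal rotation in degrees
  pitch: number; // vertical rotation in degrees
}

// Round an angle to the nearest multiple of `stepDeg` degrees.
function snapAngle(angleDeg: number, stepDeg: number): number {
  return Math.round(angleDeg / stepDeg) * stepDeg;
}

// Free rotation while the finger or cursor is down.
function onDrag(current: Orientation, dYaw: number, dPitch: number): Orientation {
  return { yaw: current.yaw + dYaw, pitch: current.pitch + dPitch };
}

// Snap only once the user lets go, as the article describes.
function onRelease(current: Orientation): Orientation {
  return {
    yaw: snapAngle(current.yaw, 45),
    pitch: snapAngle(current.pitch, 90),
  };
}

// Example: releasing at (52°, 70°) settles at (45°, 90°).
console.log(onRelease({ yaw: 52, pitch: 70 }));
```

Snapping only on release keeps the drag interaction fluid while still making it easy to land on a clean front, side, or top view.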