
Latest news with #Weizenbaum

School districts, teachers unions sue over Trump's freeze on education funding

Boston Globe

22-07-2025

  • Politics
  • Boston Globe

School districts, teachers unions sue over Trump's freeze on education funding

The new case comes as the US Department of Education has agreed to release about $1.3 billion in funding for after-school and summer programming, out of $6.8 billion withheld. No decision has been made yet about the rest of the money, a notice to states on Friday said.

The money released is for 21st Century Community Learning Centers, which include nonprofits such as the Boys & Girls Clubs that serve high-poverty, low-performing districts with after-school and summer programs. The freeze prompted alarm that those programs would have to shut down or significantly scale back in the coming weeks if the money remained withheld.

Rhode Island Education Commissioner Angélica Infante-Green said the state received $6.5 million from the released after-school funds, but the Department of Education 'hasn't provided any information about when — or if — the remaining Congressionally allocated education funds will be released.'

'Unnecessary delays and cuts to education funding for students are irresponsible,' Infante-Green said. 'Students and teachers in every school district in Rhode Island will be negatively affected.'

The Department of Education and Office of Management and Budget did not immediately comment on the new lawsuit. But OMB previously said it was withholding the funds, which are typically disbursed on July 1, in order to review whether the programs were spreading a 'radical leftwing agenda,' including support for undocumented immigrants.

Miriam Weizenbaum, the attorney for the plaintiffs, said the administration would have to follow the appropriate federal procedures to withhold money for that reason, which wasn't done here. 'You get more notice and opportunity to be heard with a speeding ticket,' Weizenbaum said.

The new lawsuit said the 'uncertainty' about the funds is 'causing significant anxiety and confusion among the Teachers Unions' members right before the start of the school year.'

'The Teachers Unions are under intense stress and pressure to help members determine exactly how their jobs will be affected,' the suit said. 'Some members will be scrambling to find new jobs.'

If cuts take place, class sizes could grow, the lawsuit said, making it 'more difficult for teachers to effectively perform their jobs' and harder for districts to retain teachers.

The plaintiffs include the Anchorage School District in Alaska, the largest district in that state, along with two other Alaskan districts, the Cincinnati Public Schools, and large teachers unions in California, Pennsylvania, Florida, New York, Rhode Island, Illinois, Ohio, and Texas.

Weizenbaum, who was previously a top litigator in the Rhode Island Attorney General's Office, said it's unclear whether the judge in the separate case would make a ruling that affects all 50 states or just the 23 states that sued, which would leave out some of the districts in the new case. She said the teachers unions also wanted to bring the separate suit to make sure their experience of what the funding cuts will bring is heard before the court.

'Their on-the-ground perspective needs to be before a court,' Weizenbaum said. 'This is a big hit across the country at all levels.'
Maribeth Calabro, the president of the Rhode Island Federation of Teachers, wrote in a declaration attached to the lawsuit that a wide range of jobs in Providence are funded with the withheld money, including instructional coaches, social workers, and behavioral specialists. The withheld money 'has created widespread uncertainty about staffing levels, student support services, and professional development availability for the upcoming school year,' Calabro wrote. She said Rhode Island's ability to teach the science of reading could be in jeopardy, along with Providence's compliance with a US Department of Justice settlement over properly teaching English to multilingual learners.

The group asked for a preliminary injunction to release the funding as the case is heard. A hearing date has not yet been set.

Remembering Eliza, one of the first chatbots: Lessons, warnings it holds for AI today

Indian Express

02-06-2025

  • Science
  • Indian Express

Remembering Eliza, one of the first chatbots: Lessons, warnings it holds for AI today

In 1966, at a lab at the Massachusetts Institute of Technology (MIT), computer scientist Joseph Weizenbaum unveiled one of the first chatbots in history: Eliza. It ran on a computer that was among the most advanced at MIT at the time — the IBM 7090 — and could be accessed through a typewriter-like terminal. Eliza had different 'scripts' — or ways of interacting — and could mimic a math teacher, a poetry teacher or a quiz master, among other things. But its most famous script was called DOCTOR, which emulated a therapist.

Weizenbaum would later write about the anthropomorphisation of ELIZA, which, in his own words, led him to 'attach new importance to questions of the relationship between the individual and the computer'. Eventually, the myth-making around it reached such an extent that the tendency to attribute human qualities to computers came to be known as the ELIZA effect. Weizenbaum also spoke later about excessive reliance on computers, arguing that no matter how impressive the machines seemed, what they pulled off could not amount to real understanding. These concerns, and the debates that followed, still matter today as we navigate a world with rapidly developing Artificial Intelligence (AI) tools.

Weizenbaum was Jewish and fled Nazi Germany with his parents, arriving in the United States in the mid-1930s. In 1955, he was part of a team at American conglomerate General Electric that automated some key banking operations for the first time. He also developed a programming language called SLIP, or 'Symmetric List Processor'. This was part of an approach that worked with sentences rather than only numbers, as computing had done until then. Weizenbaum was invited to join MIT's Project MAC, a computer science lab. Among other things, it was the first to build an interactive time-sharing system, where multiple users could use a single computer simultaneously. Weizenbaum built ELIZA at the New England university between 1964 and 1966.

What Eliza did

In the introduction to Computer Power and Human Reason: From Judgment to Calculation (1976), Weizenbaum breaks down how Eliza works in extremely simple terms. 'I composed a computer program with which one could converse in English. The human conversationalist partner would type his portion of the conversation on a typewriter connected to a computer, and the computer, under control of my program, would analyse the message that had been transmitted to it, compose a response to it in English, and cause the response to be typed on the computer's typewriter,' Weizenbaum wrote.

He explains that the program consisted of two parts: a language analyser and a script. A script, he describes, is a set of 'rules rather like those that might be given to an actor who is to use them to improvise around a certain theme. Thus, Eliza could be given a script to enable it to maintain a conversation — play a specific conversational role.' Each time a user typed something, ELIZA would examine it, looking for keywords that had entries in the currently active script. These words were then ranked and substituted, resulting in a response.

One interaction of the DOCTOR script, which was that of a psychotherapist, was reproduced by Weizenbaum in a paper on ELIZA. It goes as follows, with the text in all caps being ELIZA's responses:

'Men are all alike.
IN WHAT WAY
They're always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE
Well, my boyfriend made me come here.
YOUR BOYFRIEND MADE YOU COME HERE
He says I am depressed much of the time.
I AM SORRY TO HEAR YOU ARE DEPRESSED…'

DOCTOR ended up becoming ELIZA in popular perception. A number of practicing psychiatrists seriously believed the DOCTOR script could grow into a nearly completely automatic form of psychotherapy, Weizenbaum wrote. 'If the method proves beneficial, then it would provide a therapeutic tool which can be made widely available to mental hospitals and psychiatric centers suffering a shortage of therapists,' one therapist wrote at the time.

Weizenbaum also documented an incident with his secretary, who started conversing with ELIZA using the DOCTOR script. 'After only a few interchanges with it, she asked me to leave the room. Another time, I suggested I might rig the system so that I could examine all conversations anyone had had with it, say, overnight. I was promptly bombarded with accusations that what I proposed amounted to spying on people's most intimate thoughts; clear evidence that people were conversing with the computer as if it were a person…' Weizenbaum wrote that he had not realised that 'extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people'.

Eliza reimagined

With computer science evolving rapidly, the code that constituted ELIZA was never published or reproduced, and the original code was only discovered in 2021 among a stack of Weizenbaum's papers. It had to be copied by hand by Stanford professor Jeff Shrager, who now works on a digital archival project on Eliza along with a multi-disciplinary team of academics across the world.

What it means today

ELIZA is critical in computer science history as an early demonstration of the Turing test (how human-like a machine's responses are) by a machine replicating human language. Of course, it also set off the obsession with getting computers to talk and interact with us, leading us to this moment in history where we are able to generate personalised videos, images and text at the drop of a hat.

Digital Humanities professor David Berry at the University of Sussex, who is part of the digital archiving project along with Shrager, tells The Indian Express that 'ELIZA is a 420-line program written in an obscure programming language, which is radically different from the LLMs (large language models) like ChatGPT, a gigantic system with billions of parameters'. 'Eliza can run on any computer today and consume hardly any electricity, whereas ChatGPT consumes vast quantities of power,' Berry said. Contemporary LLMs, which are powered by huge data centres, require about 0.14 kilowatt-hours (kWh) of electricity to generate a single 100-word response, equal to powering 14 LED light bulbs for one hour, as per calculations by The Washington Post.

Berry also talks about how ELIZA 'offered a crucial early warning about human susceptibility to computational deception'. He adds that 'examining ELIZA's source code helped to demonstrate that convincing human-computer interaction does not require genuine comprehension; rather, it can emerge from clever pattern matching and careful interface design that exploits human cognitive biases'. 'Even modern large language models, despite their impressive capabilities, fundamentally operate through statistical pattern recognition rather than genuine understanding,' Berry says.
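To make that mechanism concrete, here is a minimal Python sketch of the keyword-ranking-and-substitution idea described above, with a generic fallback when no keyword matches. It is only an illustration, not Weizenbaum's original MAD-SLIP code: the keyword ranks, response templates and reflection table below are invented for demonstration.

# Minimal, illustrative ELIZA-style responder (not the original code).
# Keywords, ranks and templates here are invented for demonstration only.
import random
import re

# Each keyword carries a rank (higher wins) and one or more response templates.
SCRIPT = {
    "depressed": (10, ["I AM SORRY TO HEAR YOU ARE DEPRESSED"]),
    "boyfriend": (5, ["YOUR BOYFRIEND {rest}"]),
    "always": (3, ["CAN YOU THINK OF A SPECIFIC EXAMPLE"]),
    "alike": (2, ["IN WHAT WAY"]),
}

# First/second-person swaps applied to the fragment echoed back to the user.
REFLECT = {"i": "you", "my": "your", "me": "you", "am": "are", "you": "i"}

def reflect(text):
    return " ".join(REFLECT.get(w, w) for w in text.lower().split())

def respond(user_input):
    words = re.findall(r"[a-z']+", user_input.lower())
    hits = [(SCRIPT[w][0], w) for w in words if w in SCRIPT]
    if not hits:
        # No keyword found: offer a content-free prompt to keep the user talking.
        return random.choice(["PLEASE GO ON", "TELL ME MORE ABOUT THAT"])
    _, keyword = max(hits)  # highest-ranked keyword wins
    template = random.choice(SCRIPT[keyword][1])
    # Echo the part of the sentence after the keyword, with pronouns reflected.
    rest = user_input.lower().split(keyword, 1)[1].strip(" .!?")
    return template.format(rest=reflect(rest)).upper().strip()

print(respond("Well, my boyfriend made me come here."))  # YOUR BOYFRIEND MADE YOU COME HERE

Even this toy version shows why the effect was so persuasive: the program understands nothing, yet mirroring the user's own words back as a question reads like attentive listening.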

Does ELIZA, the first chatbot created 60 years ago, hold lessons for modern AI?

CBC

18-02-2025

  • Entertainment
  • CBC

Does ELIZA, the first chatbot created 60 years ago, hold lessons for modern AI?

Scientists recently resurrected the original computer code for ELIZA, considered the world's first chatbot and thought by many to be the first use of artificial intelligence (AI). Its original creator, however, was adamant the program was not intelligent at all.

ELIZA was a simple program created in the 1960s by Joseph Weizenbaum, a professor at the Massachusetts Institute of Technology who was interested in human-computer interfaces. He designed ELIZA as a generalized conversation program to make natural language conversation with a computer possible. The program could run scripts with different language rules to represent individual conversational personas.

Weizenbaum named this early natural language processing program ELIZA after Eliza Doolittle in the 1912 George Bernard Shaw play Pygmalion, in which a commoner learns to speak with upper-class English diction despite her working-class roots. "Like the Eliza of Pygmalion fame, it can be made to appear even more civilized," Weizenbaum wrote in his 1966 paper in the journal Communications of the ACM.

He wrote the first and most well-known script to run on ELIZA, called "DOCTOR," to mimic a first interview between a patient and psychotherapist. It responded to inputs from the user by selecting key words and spinning them back in a friendly way. For example, here is part of the most famous transcript that Weizenbaum described in his 1966 paper. It's an exchange between the program and a young woman after it initially prompted her with, "PLEASE TELL ME YOUR PROBLEM."

User: "Men are all alike."
ELIZA: "IN WHAT WAY"
User: "They're always bugging us about something or other."
ELIZA: "CAN YOU THINK OF A SPECIFIC EXAMPLE"
User: "Well, my boyfriend made me come here."
ELIZA: "YOUR BOYFRIEND MADE YOU COME HERE"
User: "He says I'm depressed much of the time."
ELIZA: "I'M SORRY TO HEAR YOU ARE DEPRESSED"

The world's first 'chatbot' was a hit

Unlike chatbots today that use large language models and train on massive language databases to generate responses, ELIZA — developed three decades before the term "chatbot" even existed — had little data to rely on. The DOCTOR script would search for keywords in the person's reply and mirror them back in its response. If no keyword was detected, the script would simply offer a generic response. ELIZA was programmed to prompt the user to continue the conversation. That made it seem like a good listener, which is important in psychotherapy, and it's what made the program a huge sensation.

During the late 1960s and early '70s, before computers shrank for individual use, they were large and quite costly mainframe systems. So ELIZA seemed miraculous, with students believing the machine could think like a human and understand their problems. At the time, Weizenbaum described the response to ELIZA as "a striking form of Turing's test," where a user cannot tell whether responses are coming from a machine or a real person.

I had the privilege of meeting Joseph Weizenbaum in the early '80s. He told me, "The program totally backfired. People thought ELIZA was intelligent, they were confiding in the machine, revealing personal issues they would not tell anyone else. Even my secretary asked me to leave the room so she could be alone with the computer. They called me a genius for creating it, but I kept telling them that the computer was not thinking at all."
Later, Weizenbaum wrote a book called Computer Power and Human Reason: From Judgment to Calculation, in which he emphasized that computers, as clever and capable as they may become, do not think like humans and should never replace humans in roles such as doctors, teachers or scientists. He disliked the term "artificial intelligence," believing that humans are always necessary and computers should never be allowed to make important decisions.

Reanimating defunct code

For nearly 60 years, AI historians thought the original 420-line computer code for ELIZA and the famous DOCTOR script were lost. But in 2021, two software sleuths found the original printouts of the code in a dusty box of Weizenbaum's archives at MIT. Those software scientists, among others, wrote in a paper that has yet to be peer-reviewed that they figured the only way to see if the code worked was to try it — a task made even more difficult given that the defunct code was written for a computer and operating system that no longer existed.

Back in the 1960s, MIT had an IBM 7094, an early transistorized computer loaded with 32,768 words of user memory. At the time, it was one of the biggest and fastest computers available. The operating system developed for it was called the Compatible Time-Sharing System (CTSS). It was also the world's first time-sharing system, allowing multiple users to share one machine; it could support around 30 users at once.

To resurrect the original ELIZA program with its DOCTOR script, the researchers used a restored CTSS operating system on hardware and software designed to emulate the original IBM 7094. On Dec. 31, 2024, they brought ELIZA back to life and tested it by recreating the "Men are all alike" conversation. The revived version, adapted to work on modern systems, is available online for anyone to try out.

Weizenbaum's legacy lives on in Germany at the Weizenbaum Institute, dedicated to the critical exploration and constructive shaping of digitization for the benefit of society.

Today, AI is a powerful new tool that is having a profound influence on science, medicine, academics and culture. It's also growing at an astounding rate. This growth comes with a very real fear factor, helped along by Hollywood with the likes of the Terminator film series or WarGames, a 1983 film in which a computer nearly triggers a nuclear war — and, more recently, by ominous warnings from AI industry insiders.

This past week, government leaders, executives, and experts from over 100 countries met in Paris for the Artificial Intelligence Action Summit to discuss the future of AI, with a focus on how to keep it both accessible and safe as the technology continues to develop at breakneck speed.
