
Inside the most secret radioactive site in the US, said to be a 'Chernobyl in the making'
A huge radioactive site sprawling across almost 600 square miles of desert has been compared to an "underground Chernobyl", with warnings that it is a disaster waiting to happen.
The Hanford Site in Washington state was constructed during World War II and is known as one of the country's most radioactive chemical contamination sites. It was built as part of the Manhattan Project, where workers produced the plutonium for the bomb that was eventually dropped on Nagasaki, Japan, on August 9, 1945. According to experts, the site's most hazardous waste lies hidden in tanks and unlined trenches. Washington's Department of Ecology has warned that 177 ageing storage tanks buried on the property hold 56 million gallons of radioactive waste, and some have leaked.
Now the Department of Energy has proposed the site as a potential location for AI development. Richland made the list because it is home to the department's Pacific Northwest National Laboratory, opening the door to partnerships with developers on advanced hardware for next-generation data centres and the power systems needed to run them.
The 295 acres in Richland, however, are former Hanford nuclear site land, transferred by the Department of Energy, via the Tri-City Development Council, to the city of Richland in 2015 to be developed as part of a new Advanced Clean Energy Park. But the land has a long, deadly history.
The toxic facility was so dangerous that it was nicknamed "death mile" in 1985 after local farmers were diagnosed with a string of cancers. The illnesses were linked to residents breathing in iodine-131, a radioactive isotope that, in controlled medical doses, is used to treat thyroid cancer and hyperthyroidism.
When the land was first seized for the project, government officials gave locals a mere 30 days to leave the area; residents were paid for their land, but Indigenous tribes received no compensation.
The War Powers Act of 1942, which predates the better-known War Powers Resolution of 1973, allowed the government to take land for military purposes. Under the act, President Franklin Delano Roosevelt's administration acquired over 600 square miles of land.
Up to 55,000 men and women were brought to the area to work on the secret development. Most knew very little about what they were working on, or even that it was for the war; it was reported that only 5 per cent actually knew why they had been hired.
The workers were given access to almost ten dining halls, a hospital, a post office, barbershops and a cinema. The Department of Ecology notes they could also use dance halls and bowling alleys to keep themselves entertained outside working hours.
The site's first full-scale plutonium production reactor was reportedly completed within 13 months, and the operation was launched in 1944. Physicist Leona Libby, then 23, was part of the team that achieved the first nuclear chain reaction, a breakthrough that later helped create the bomb.
Only once the bomb had been dropped were workers given further insight into what the project had been for. After the Second World War ended, the site continued production throughout the Cold War, and its last reactor was built in 1963.
While the site was in use, more than 400 billion gallons of contaminated liquid were dumped into the ground. According to the Department of Ecology, which was established in 1970 and has assessed concerns surrounding the site, that material seeped into the groundwater and even reached parts of the Columbia River. The site was forced to close its doors by 1989, after red flags were raised in 1987. Despite decades of discussion about clean-up efforts, the Hanford Site is still viewed as an environmental concern.