Technology

106 readers
106 users here now

Share interesting Technology news and links.

Rules:

  1. No paywalled sites at all.
  2. News articles have to be recent, no older than 2 weeks (14 days).
  3. No videos.
  4. Post only direct links.

To encourage original sources and keep this space as commercial-free as possible, the following websites are blacklisted:

Encouraged:

founded 3 weeks ago
MODERATORS

"The phrase artificial intelligence is a marketing term that is used to sprinkle some magic fairy dust that brings the venture capital dollars."

Of course, power demand is set to continue expanding rapidly as the supply chain increases its production capacity while demand remains high. TSMC has already confirmed its target to double its CoWoS capacity again in 2025 (see Data S1, sheet 2). This could mean the total power demand associated with devices produced using TSMC’s CoWoS capacity will also double from 2024 to 2025—just as it did from 2023 to 2024 (Figure 1), when TSMC similarly doubled its CoWoS capacity. At this rate, the cumulative power demand of AI accelerator modules produced in 2023, 2024, and 2025 could reach 12.8 GW by the end of 2025. For AI systems, this figure would rise to 23 GW, surpassing the electricity consumption of Bitcoin mining and approaching half of total data center electricity consumption (excluding crypto mining) in 2024.

However, with the industry transitioning from CoWoS-S to CoWoS-L as the main packaging technology for AI accelerators, continued suboptimal yield rates for this new packaging technology may slow down both device production and the total power demand associated with these devices. Moreover, although demand for TSMC’s CoWoS capacity exceeded supply in both 2023 and 2024, it is not guaranteed that this trend will persist throughout 2025. Several factors could lead to a slowdown in AI hardware demand, such as waning enthusiasm for AI applications. Additionally, AI hardware may face new bottlenecks in the manufacturing and deployment process. While limited CoWoS capacity has constrained AI accelerator production and power demand over the past 2 years, export controls and sanctions driven by geopolitical tensions could introduce new disruptions in the AI hardware supply chain.

Chinese companies have already faced restrictions on the type of AI hardware they can import, contributing to the notable release of Chinese tech company DeepSeek’s R1 model. This large language model may achieve performance comparable to that of OpenAI’s ChatGPT, reportedly using less advanced hardware and innovative software. These innovations can reduce the computational and energy costs of AI. At the same time, this does not necessarily change the “bigger is better” dynamic that has driven AI models to unprecedented sizes in recent years. Any positive effects on AI power demand as a result of efficiency gains may be negated by rebound effects, such as incentivizing greater use and the use of more computational resources to improve performance. Furthermore, multiple regions attempting to develop their own AI solutions may, paradoxically, increase overall AI hardware demand.

Tech companies may also struggle to deploy AI hardware, given that Google already faced a “power capacity crisis” while attempting to expand data center capacity. For now, researchers will have to continue navigating limited data availability to determine what TSMC’s expanding CoWoS capacity means for the future power demand of AI.
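
To make the doubling arithmetic concrete, here is a minimal back-of-envelope sketch in Python that works backward from the two stated totals (12.8 GW for accelerator modules, 23 GW at the AI-system level). The 1:2:4 split across 2023-2025 and the single system-level overhead factor are illustrative assumptions, not figures given in the excerpt.

```python
# Back-of-envelope check of the cumulative power-demand arithmetic above.
# Assumptions (not stated in the text): module power demand scales 1:1 with
# CoWoS capacity, capacity doubles each year, and the gap between the module
# and system totals is a single constant overhead factor (cooling, other
# server components, etc.).

CUMULATIVE_MODULES_GW = 12.8  # stated: modules produced 2023-2025, by end of 2025
CUMULATIVE_SYSTEMS_GW = 23.0  # stated: same devices counted at the AI-system level

# If each year doubles the previous one, the 2023/2024/2025 shares are 1:2:4.
weights = {2023: 1, 2024: 2, 2025: 4}
base_gw = CUMULATIVE_MODULES_GW / sum(weights.values())

per_year_gw = {year: w * base_gw for year, w in weights.items()}
overhead = CUMULATIVE_SYSTEMS_GW / CUMULATIVE_MODULES_GW

for year, gw in per_year_gw.items():
    print(f"{year}: ~{gw:.1f} GW of module power demand")
print(f"Implied system-level overhead factor: ~{overhead:.2f}x")
```

Under these assumptions the split works out to roughly 1.8, 3.7, and 7.3 GW for 2023, 2024, and 2025, with the system-level figure implying about a 1.8x overhead on top of the modules themselves.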

AI-generated child sexual abuse material (CSAM) carries unique harms. When generated from a photo of a clothed person, it can damage that person’s reputation and cause serious distress. When based on existing CSAM, it risks re-traumatizing victims. Even AI CSAM that seems purely synthetic may come from a model that was trained on real abusive material. Many experts also warn that viewing AI CSAM can normalize child abuse and increase the risk of contact abuse. There is the added risk that law enforcement may mistake AI CSAM for content involving a real, unidentified victim, leading to wasted time and resources spent trying to locate a child who does not exist.

In this report we aim to understand how educators, platform staff, law enforcement officers, U.S. legislators, and victims are thinking about and responding to AI CSAM. We interviewed 52 people, analyzed documents from four public school districts, and coded state legislation.

Our main findings are that while the prevalence of student-on-student nudify app use in schools is unclear, schools are generally not addressing the risks of nudify apps with students, and some schools that have experienced a nudify incident have made missteps in their response. We additionally find that mainstream platforms report the CSAM they discover, but, for various reasons, their reports to the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline do not systematically indicate whether the material is AI-generated. This means the task of identifying AI-generated material falls to NCMEC and law enforcement. However, frontline platform staff believe the prevalence of AI CSAM on their platforms remains low. Finally, we find that legal risk is hindering CSAM red-teaming efforts at mainstream AI model-building companies.

  • Brazil is testing a digital wallet program that allows users to monetize their data.
  • A federal bill, if passed, would turn data into commercial assets for citizens, the first such proposal in the world.
  • The pilot, a partnership between the public and private sectors, is ahead of similar initiatives in some U.S. states.

Today, I am announcing a new visa restriction policy that will apply to foreign nationals who are responsible for censorship of protected expression in the United States. It is unacceptable for foreign officials to issue or threaten arrest warrants on U.S. citizens or U.S. residents for social media posts on American platforms while physically present on U.S. soil. It is similarly unacceptable for foreign officials to demand that American tech platforms adopt global content moderation policies or engage in censorship activity that reaches beyond their authority and into the United States. We will not tolerate encroachments upon American sovereignty, especially when such encroachments undermine the exercise of our fundamental right to free speech.

  • Nick Clegg, former Meta executive and UK Deputy Prime Minister, has reiterated a familiar line when it comes to AI and artist consent.
  • He said that any push for consent would “basically kill” the AI industry.
  • Clegg added that the sheer volume of data that AI is trained on makes it “implausible” to ask for consent.