Delta Plane Crash at Toronto Pearson Airport Leaves 18 Injured https://hadamard.com/c/delta-plane-crash-at-toronto-pearson-airport-leaves-18-injured/ Tue, 18 Feb 2025 05:07:46 +0000

Toronto, Canada — On the afternoon of Monday, February 17, 2025, a Delta Air Lines plane, operated by its subsidiary Endeavor Air, experienced a catastrophic incident while landing at Toronto Pearson International Airport. The plane, identified as a Bombardier CRJ-900, flipped upon landing, ending up upside down on the runway. The event marked a significant aviation incident in a series of recent North American plane crashes.

The Incident

The flight, Delta Connection Flight 4819, was coming from Minneapolis-St. Paul International Airport when it crashed around 2:45 p.m. local time. The aircraft, carrying 76 passengers and four crew members, encountered severe weather conditions including gusting crosswinds and blowing snow, which may have contributed to the crash.

Emergency services responded promptly, with firefighters working to control the situation as passengers were evacuated from the aircraft. Remarkably, despite the dramatic overturn, there were no fatalities reported, though 18 individuals were injured, with conditions ranging from critical to minor. Three of the injured, including a child, were in critical but non-life-threatening condition.

Immediate Aftermath

The airport confirmed that all passengers and crew were accounted for. Emergency crews managed a commendable evacuation, ensuring no one was left inside the aircraft when it caught fire. The plane’s design, with seats engineered to withstand high impacts, likely played a role in preventing more severe injuries.

In response to the crash, Delta Air Lines issued a statement focusing on the care of those affected, canceled all remaining flights to Toronto for that day, and issued travel waivers for passengers affected by the closure of the airport’s runways.

Investigation and Impact

The Transportation Safety Board of Canada has taken the lead in investigating the cause of the crash, with support from U.S. investigators from the National Transportation Safety Board. The focus will be on understanding the weather conditions, the pilot’s actions, and the aircraft’s mechanical integrity at the time of the incident.

Toronto Pearson Airport’s CEO, Deborah Flint, stated that while the airport would continue operations, two runways would remain closed for several days to facilitate the investigation. This incident has added to a recent string of aviation disasters in North America, raising questions about safety protocols, maintenance, and weather management in aviation.

Community and Response

The crash has sparked discussions on social media and among aviation experts about the safety of regional jets in adverse weather conditions. There’s also been speculation regarding the influence of reduced staff in aviation regulatory bodies on safety standards, though these points are yet to be substantiated by official investigations.

The community and authorities in Toronto have shown resilience and efficiency in managing the aftermath, with local hospitals prepared to handle the influx of patients from the crash. Delta Air Lines has committed to assisting with the investigation and ensuring support for all involved passengers and crew.

As the investigation progresses, this incident serves as a sobering reminder of the complexities and risks involved in air travel, particularly under challenging weather conditions. The aviation industry will likely scrutinize this event to enhance safety measures and prevent future occurrences.

Grok 3 Goes Live: A Leap Forward in AI Intelligence https://hadamard.com/c/grok-3-goes-live-a-leap-forward-in-ai-intelligence/ Sun, 16 Feb 2025 08:59:00 +0000

In an exciting announcement for AI enthusiasts and tech aficionados alike, Elon Musk has revealed that Grok 3, the latest iteration of xAI’s groundbreaking AI model, will go live with a demonstration on Monday night at 8 PM PT. Touted as the “smartest AI on Earth,” Grok 3 promises to redefine the capabilities of artificial intelligence with significant enhancements in reasoning, processing speed, and real-time interaction.

Unprecedented Performance

Grok 3 is set to offer a substantial leap in performance over its predecessors. The model has been designed with an emphasis on advanced reasoning capabilities, aiming to tackle complex problems with solutions that are not just accurate but also innovative and non-obvious. Early insights suggest that Grok 3 has been trained on a vast dataset, including synthetic data, allowing it to learn and adapt at an accelerated pace.

The increase in computational power used for training Grok 3 is noteworthy. Posts on X have indicated that this version has been trained with up to 10 times more compute than Grok 2, with plans to push this even further to 20 times. This massive increase in processing power enables the AI to handle larger datasets more efficiently, leading to smarter, quicker, and more nuanced responses to queries.

Real-Time Interaction and Multimodal Mastery

One of the most anticipated features of Grok 3 is its real-time capability. The promise is not just about responding faster but doing so with a depth of understanding that can keep up with live, dynamic scenarios. Whether it’s answering questions or engaging in natural, flowing conversation, Grok 3 aims to make AI interactions feel more human-like than ever before.

Moreover, Grok 3 is designed to be a multimodal AI, meaning it can process and respond across various forms of media – text, images, audio, and potentially more. This versatility could revolutionize how we interact with technology, opening up new avenues for education, entertainment, and productivity where AI can assist in ways previously unimagined.

Integration with X

Since Grok is already integrated into X (formerly Twitter), the live demo could showcase how these advanced capabilities enhance user experience directly on the platform. From real-time content analysis to personalized, intelligent responses, Grok 3 could transform how users engage with content and each other, offering insights, translations, or even creative content generation on the fly.

The Broader Impact

The introduction of Grok 3 could signal a new era in AI, pushing competitors to innovate or catch up. The model’s capabilities might set new standards for what AI can achieve in terms of utility and autonomy in digital environments. For developers, researchers, and businesses, Grok 3 represents a tool that could be pivotal in advancing projects that require high-level AI assistance.

Conclusion

As we approach the launch event, the tech community is abuzz with anticipation. Grok 3’s promise of being the “smartest AI on Earth” isn’t just hype; it’s backed by significant technological developments and a clear vision for the future of AI. Whether it will live up to these high expectations remains to be seen, but one thing is clear: Monday night’s live demo will be a defining moment for xAI and the broader AI landscape. Stay tuned for what could very well be a new dawn in artificial intelligence.

The TSMC N2 Node: Pioneering the Future of Semiconductor Technology https://hadamard.com/c/the-tsmc-n2-node-pioneering-the-future-of-semiconductor-technology/ Fri, 14 Feb 2025 18:42:45 +0000

The semiconductor industry is perpetually on the cusp of groundbreaking advancements, and one of the most anticipated developments in recent times is the Taiwan Semiconductor Manufacturing Company’s (TSMC) N2 node. Set to redefine the landscape of chip manufacturing, the N2 node is TSMC’s foray into the 2nm class fabrication technologies, promising significant leaps in performance, power efficiency, and transistor density.

Overview of TSMC’s N2 Node

TSMC’s N2 node, officially entering high-volume manufacturing (HVM) in late 2025, introduces several technological innovations. It’s the first TSMC process to employ gate-all-around (GAA) nanosheet transistors, which encircle the transistor channel from all sides, reducing leakage current and allowing for better control over power and performance. The node also leans heavily on extreme ultraviolet (EUV) lithography to maintain precision and efficiency in chip manufacturing.

  • Performance Gains: Compared to the N3E node, N2 promises a 10% to 15% performance increase at the same power or a reduction in power consumption by 25% to 30% at similar performance levels (a short back-of-the-envelope sketch of these figures follows this list).
  • Density Improvements: The N2 node is expected to offer around a 15% increase in transistor density, crucial for packing more functionality into the same chip area.
  • Power Efficiency: With enhancements like backside power delivery planned for later members of the process family, power delivery to transistors becomes more efficient, reducing energy loss.
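To make the headline figures above concrete, here is a minimal back-of-the-envelope sketch in Python. It simply applies the quoted ranges to a hypothetical baseline design; the baseline numbers (10 W power draw, 100 MTr/mm² density) are illustrative assumptions, not TSMC data.

```python
# Back-of-the-envelope comparison of N2 vs. N3E using the ranges quoted above.
# Baseline figures for the hypothetical N3E design are illustrative assumptions.

baseline_power_w = 10.0           # assumed N3E chip power at a fixed workload
baseline_density_mtr_mm2 = 100.0  # assumed N3E transistor density (MTr/mm^2)

perf_gain = (0.10, 0.15)   # +10% to +15% performance at the same power
power_cut = (0.25, 0.30)   # -25% to -30% power at the same performance
density_gain = 0.15        # ~15% higher transistor density

power_on_n2 = [baseline_power_w * (1 - cut) for cut in power_cut]
density_on_n2 = baseline_density_mtr_mm2 * (1 + density_gain)

print(f"Same performance: {power_on_n2[1]:.1f}-{power_on_n2[0]:.1f} W instead of {baseline_power_w:.1f} W")
print(f"Same power: {perf_gain[0]*100:.0f}-{perf_gain[1]*100:.0f}% more performance")
print(f"Density: ~{density_on_n2:.0f} MTr/mm^2 instead of {baseline_density_mtr_mm2:.0f} MTr/mm^2")
```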

Products Expected to Use TSMC’s N2 Node

1. High-Performance Computing (HPC) Applications:

  • CPUs: Companies like Intel and AMD are likely to leverage N2 for their next-generation processors aimed at both data centers and personal computing. Intel, for instance, already sources tiles for some client CPUs from TSMC’s leading-edge nodes and is widely expected to tap N2 for future products.
  • GPUs: Nvidia and AMD are expected to use N2 for their high-end graphics cards, which would significantly boost gaming and AI computing capabilities due to enhanced performance and power efficiency.

2. Mobile Devices:

  • Smartphones: Apple, a key TSMC client, is widely expected to adopt N2 for a coming generation of A-series chips for iPhones, enhancing device performance and battery life.
  • 5G Modems: Enhanced power efficiency at high frequencies could lead to more efficient 5G modems, benefiting devices from smartphones to IoT components.

3. Artificial Intelligence and Machine Learning:

  • AI Chips: The power efficiency and performance of N2 are ideal for AI accelerators, where even minor improvements can lead to substantial gains in AI model training and inference speeds.

4. Automotive Industry:

  • Automotive Chips: With the automotive sector leaning heavily into advanced driver-assistance systems (ADAS) and fully autonomous vehicles, N2 could be pivotal for chips that need to process vast amounts of data in real-time with low power consumption.

5. Data Center Infrastructure:

  • Server CPUs and GPUs: The N2 node’s capabilities could be harnessed to create more efficient server solutions, reducing the energy footprint of data centers while increasing computational capacity.

Market Impact and Challenges

The introduction of the N2 node is set to maintain TSMC’s leadership in semiconductor manufacturing, fostering new levels of innovation across various tech sectors. However, the transition to such advanced nodes also comes with challenges:

  • Cost: The complexity and cost of manufacturing at 2nm scale mean that chips could become significantly more expensive, potentially impacting product pricing.
  • Yield Rates: Initial yield rates might be lower as manufacturers grapple with the nuances of the new process, although early reports suggest TSMC is on track with expected yields.
  • Competition: Intel’s aggressive roadmap with its 18A node and Samsung’s advancements in similar technology classes will ensure that TSMC faces stiff competition.

Conclusion

TSMC’s N2 node is not just a technical achievement but a beacon for future tech trends, influencing everything from consumer electronics to massive data center operations. As we approach the mass production phase in late 2025, the tech community watches eagerly, anticipating how these chips will redefine what’s possible in digital and computational technologies. The success of N2 will likely hinge on balancing cutting-edge performance with the economic realities of production scale and cost, paving the way for the next generation of technology products.

Nvidia’s GeForce RTX 5090: Power Delivery Issues Surface https://hadamard.com/c/nvidias-geforce-rtx-5090-power-delivery-issues-surface/ Thu, 13 Feb 2025 13:55:20 +0000

In the latest developments surrounding Nvidia’s flagship GPU, the GeForce RTX 5090, there have been reports of significant concerns regarding its power delivery system. Early adopters and tech enthusiasts have pointed to what appears to be a critical design flaw: up to 575 watts can flow into the card through a single combined 12 V power path, with no per-wire balancing or monitoring to keep individual connector pins within their limits.

The Issue at Hand

The GeForce RTX 5090, marketed as Nvidia’s most powerful consumer GPU to date, is designed to offer unparalleled performance, particularly for gamers and professionals who demand the highest levels of graphical fidelity and computational power. However, this power comes with a caveat. Recent analyses, including those by renowned tech reviewers and users on platforms like X, have highlighted that the GPU can draw an immense amount of power through one of its PCB traces, leading to potential overheating and safety issues.

Safety Mechanisms Lacking

One of the key points of criticism has been the apparent lack of robust safety features to manage such high power loads. According to reports, there are no sufficient safeguards to prevent excessive current from causing damage either to the GPU itself or to the power supply unit (PSU) connected to it. This has led to several incidents where power connectors have literally melted, a phenomenon not unfamiliar to those who remember similar issues with the RTX 4090 series.

User Experiences and Technical Analysis

Several users have shared their experiences online, detailing how their RTX 5090’s power connectors have failed, leading to damaged hardware. Technical analyses conducted by experts like Der8auer have pointed out that uneven power distribution across the cables can lead to one wire carrying more current than it’s designed for, with temperatures reaching up to 150 degrees Celsius. This not only risks the integrity of the hardware but also poses a fire hazard.
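As a rough sanity check on those reports, the arithmetic below (in Python) shows why an uneven current split matters. It assumes a nominal 12 V rail, six 12 V wires in the 12V-2x6 cable, and a commonly cited per-contact rating of roughly 9.5 A; all three figures are assumptions for illustration, not Nvidia specifications.

```python
# Rough illustration of why uneven current sharing across the 12V-2x6
# connector is dangerous. All ratings here are assumptions for illustration.

board_power_w = 575.0        # reported maximum board power of the RTX 5090
rail_voltage_v = 12.0        # nominal supply voltage
num_12v_wires = 6            # 12 V wires in a 12V-2x6 / 12VHPWR cable
assumed_pin_rating_a = 9.5   # commonly cited per-contact rating (assumption)

total_current_a = board_power_w / rail_voltage_v
ideal_per_wire_a = total_current_a / num_12v_wires

# Hypothetical imbalance: one wire ends up carrying half of the total current.
worst_wire_a = total_current_a * 0.5

print(f"Total current: {total_current_a:.1f} A")
print(f"Even split: {ideal_per_wire_a:.1f} A per wire (within the assumed {assumed_pin_rating_a} A rating)")
print(f"Badly imbalanced wire: {worst_wire_a:.1f} A, roughly {worst_wire_a / assumed_pin_rating_a:.1f}x the assumed rating")
```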

Nvidia’s Response

As of now, Nvidia has not issued an official statement regarding these specific incidents, but the company is known to be investigating reports of “bricked” RTX 50-series GPUs. There’s a growing call from the community for Nvidia to address these concerns swiftly, possibly with a hardware revision or a stronger advisory on power supply compatibility and management.

Community Reaction

The reaction from the tech community has been one of concern, with many questioning the reliability of high-end GPUs that require such massive power inputs. There’s a consensus that while performance is paramount, safety should not be compromised. Posts on X and various tech forums express frustration, with some users even reconsidering their investments in Nvidia’s latest offerings until further clarification and action are taken.

What’s Next for RTX 5090 Users?

For current and potential RTX 5090 owners, the advice is cautious. Users are being urged to ensure they use the correct and official power connectors, monitor their system temperatures diligently, and consider the quality and capacity of their PSUs. Meanwhile, Nvidia enthusiasts are waiting for an official response that could either reassure or recommend immediate action to mitigate these risks.

Conclusion

The GeForce RTX 5090’s power delivery concerns underscore the challenges of pushing hardware to new performance heights. As Nvidia navigates these issues, the broader implications for consumer trust and product safety in high-performance computing remain significant. The tech world will be watching closely to see how Nvidia responds to these emerging challenges with their flagship GPU.

Boom XB-1 Breaks Sound Barrier Again, Marking the End of Its Test Campaign https://hadamard.com/c/boom-xb-1-breaks-sound-barrier-again-marking-the-end-of-its-test-campaign/ Thu, 13 Feb 2025 13:00:00 +0000

In a significant milestone for aviation technology, Boom Supersonic‘s experimental aircraft, the XB-1, has once again broken the sound barrier, marking the conclusion of its test campaign. This achievement not only signifies a leap in the development of supersonic flight but also sets the stage for the next phase in Boom Supersonic’s ambitious plans.

The XB-1, often referred to as a “baby boom” due to its smaller size compared to its future commercial counterparts, took to the skies for its last flight on February 10, 2025. Piloted by Boom’s Chief Test Pilot, Tristan Brandenburg, the aircraft successfully exceeded the speed of sound, reaching up to Mach 1.18, or approximately 1,243 km/h, during its 13th and final test flight. This flight was not just about speed; it also set new records for altitude, reaching up to 36,514 feet.
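The quoted speed can be sanity-checked with the relation v = M · a, where a is the local speed of sound. The short Python sketch below assumes temperatures near the International Standard Atmosphere value at the mid-30,000-foot altitudes mentioned above; the actual flight-day conditions were not published here, so treat the result as approximate.

```python
import math

# Convert Mach 1.18 to an approximate true airspeed, assuming International
# Standard Atmosphere (ISA) conditions near 36,000 ft. Illustration only.
gamma = 1.4          # ratio of specific heats for air
R = 287.05           # specific gas constant for air, J/(kg*K)
T_kelvin = 216.65    # ISA temperature at/above ~36,089 ft (assumption)

speed_of_sound_ms = math.sqrt(gamma * R * T_kelvin)   # ~295 m/s
mach = 1.18
true_airspeed_kmh = mach * speed_of_sound_ms * 3.6

print(f"Speed of sound: {speed_of_sound_ms:.0f} m/s")
print(f"Mach {mach} is about {true_airspeed_kmh:.0f} km/h")  # close to the ~1,243 km/h reported
```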

The test flight, which lasted around 41 minutes, was live-streamed over the internet, allowing aviation enthusiasts and the curious public alike to witness this historic moment from the Mojave Air & Space Port in California.

This was the second and last time the XB-1 would break the sound barrier during its test phase, having first achieved this feat on January 28 of the same year. Following this successful test, the XB-1 will be retired to Boom’s headquarters in Denver, Colorado. Here, the focus will shift entirely towards the development of their next project – the Overture, a supersonic passenger jet.

The Overture is envisioned to be much more than a test vehicle. It’s designed to be a commercial airliner capable of carrying 64 to 80 passengers across the globe at twice the speed of today’s commercial jets. With plans for the Overture to cruise at Mach 1.7, Boom Supersonic aims to revive the era of supersonic travel for the masses, a sector dormant since the retirement of the Concorde in 2003.

The XB-1 served as a critical pathfinder for the Overture, testing out technologies and aerodynamics necessary for sustained high-speed flight. It was the first civilian supersonic aircraft built in the United States, showcasing innovations like a unique wing shape and cutting-edge materials to manage the stresses of supersonic travel.

Boom Supersonic’s CEO, Blake Scholl, expressed a mix of pride and nostalgia regarding the XB-1’s journey. “This is the last time she flies,” Scholl stated during the live broadcast, highlighting the bittersweet moment of moving on from one project to start another.

The development of the Overture comes at a time when there is growing interest in reducing travel time while also addressing environmental concerns. Boom has pledged that the Overture will operate on up to 100% sustainable aviation fuel, aiming for a carbon-neutral footprint.

This latest achievement with the XB-1 not only underscores Boom’s commitment to bringing back supersonic travel but also raises anticipation for what the Overture might bring to the future of air travel. As the company transitions from testing to development, all eyes will be on how they manage to balance speed, sustainability, and commercial viability in the skies.

Tumblr to Transition to WordPress Technology and Join the Fediverse https://hadamard.com/c/tumblr-to-transition-to-wordpress-technology-and-join-the-fediverse/ Wed, 12 Feb 2025 13:20:49 +0000

In a significant move for the blogging and social networking sectors, Automattic, the parent company of both Tumblr and WordPress, has announced plans to migrate Tumblr to the WordPress technology infrastructure. This strategic shift aims not only to streamline the technical backend but also to integrate Tumblr into the Fediverse, a network of interconnected, decentralized social media platforms.

Technical Migration to WordPress

The migration of Tumblr to WordPress technology is described as one of the largest technical migrations in internet history, given that Tumblr hosts over half a billion blogs. Automattic, having acquired Tumblr in 2019, has been working on this project to leverage the robustness and scalability of WordPress’s infrastructure. According to information available online, this transition is expected to make feature development easier across both platforms without altering the user experience of Tumblr. The promise is that users won’t notice any significant difference post-migration, but they will benefit from a backend that’s more stable and feature-rich thanks to the open-source developments of WordPress.

Joining the Fediverse

Perhaps even more exciting for users and the broader internet community is Tumblr’s planned integration into the Fediverse. The Fediverse operates on the ActivityPub protocol, which allows for decentralized communication between different servers and platforms. This means Tumblr users will be able to interact directly with users on other Fediverse platforms like Mastodon, Pixelfed, and even Meta’s Threads, expanding the reach and interaction possibilities for content creators and consumers alike.
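To illustrate what federation via ActivityPub looks like in practice, the sketch below builds the kind of Activity Streams 2.0 “Create” activity that a federated post is wrapped in. The actor and object URLs are hypothetical placeholders, and Tumblr’s actual implementation details have not been published; this only shows the general shape of the protocol.

```python
import json

# Minimal illustration of an ActivityPub "Create" activity wrapping a Note,
# following the Activity Streams 2.0 vocabulary. All URLs are hypothetical.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example-blog.tumblr.com/actor",        # hypothetical actor URL
    "to": ["https://www.w3.org/ns/activitystreams#Public"],  # public addressing
    "object": {
        "type": "Note",
        "content": "Hello, Fediverse!",
        "attributedTo": "https://example-blog.tumblr.com/actor",
        "published": "2025-02-12T13:20:49Z",
    },
}

# A federating server would POST this JSON (as application/activity+json)
# to the inbox of each follower's server.
print(json.dumps(activity, indent=2))
```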

The integration involves making Tumblr blogs compatible with ActivityPub, enabling posts to be shared across different Fediverse platforms. This move is seen as a significant step towards an open social web where content is not siloed within one platform but can be shared across a network of services. This could potentially lead to a resurgence in Tumblr’s user base, offering new ways to engage with a broader audience without leaving the platform.

Impact and Future Prospects

The integration of Tumblr into the Fediverse could have profound effects on how online communities interact. For Tumblr, this could mean a revitalization, positioning it not just as a platform but as a central node in a network of social interactions. For the Fediverse, having a giant like Tumblr join its ranks would validate and possibly accelerate the adoption of decentralized social networking protocols.

Automattic has not provided a specific timeline for when these changes will be fully implemented but has expressed excitement about the ongoing progress. The migration to WordPress technology is currently underway, and the integration into the Fediverse is slated to follow closely after.

Conclusion

This development represents a bold step towards a more interconnected and open internet. By adopting WordPress’s infrastructure and joining the Fediverse, Tumblr is not only modernizing its technical foundation but also aligning itself with a vision of the internet where users have more control over their data and interactions. As we move forward, it will be intriguing to watch how this integration unfolds and how it might reshape the landscape of social media and blogging.

U.S. Plans to Ban DeepSeek App Over National Security Concerns https://hadamard.com/c/u-s-plans-to-ban-deepseek-app-over-national-security-concerns/ Tue, 11 Feb 2025 14:00:00 +0000

Washington, D.C. — In a move echoing recent bans in Australia and Taiwan, the United States is set to prohibit the use of the DeepSeek app by government employees due to escalating security concerns.

DeepSeek, a Chinese AI application, has come under scrutiny for allegedly forwarding user data to several Chinese companies, raising alarms about privacy and national security. The application, known for its AI capabilities similar to those of ChatGPT, has been identified as a potential conduit for sensitive data to reach the Chinese government.

Following the lead of Australia and Taiwan, where restrictions have already been placed on government officials and those in critical infrastructure sectors from using the app, U.S. lawmakers, in a bipartisan effort, are now pushing for similar measures.

Democratic Representative Josh Gottheimer and his Republican counterpart, Darin LaHood, have introduced a proposal highlighting DeepSeek as an “alarming threat to national security.” According to the document released by the legislators, investigations have revealed direct links between DeepSeek and the Chinese government, suggesting that the app could be used to channel sensitive American data to Beijing.

The concerns stem not only from data sharing but also from the app’s potential to be manipulated for spreading misinformation or engaging in surveillance activities. The U.S. Federal Communications Commission (FCC) has previously banned China Mobile from operating in the U.S., a company to which DeepSeek has reportedly been linked for data sharing, citing similar security threats due to its affiliations with the Chinese government and military.

Representatives Gottheimer and LaHood have expressed worry not just about government data but also the high-sensitivity information that ordinary Americans might unknowingly share through the app, including contracts, documents, and financial records. These details in the wrong hands could significantly benefit the Chinese Communist Party (CCP), described in the proposal as “a known foreign adversary.”

The bipartisan move in the U.S. Congress reflects a growing international wariness toward technology from China, fueled by fears of data breaches, espionage, and the broader implications for national sovereignty and security. Countries like France and Italy have also voiced concerns over DeepSeek’s data protection practices, with Italy having previously imposed a temporary ban on a similar AI service, ChatGPT.

As discussions continue, the potential ban underscores a broader debate on how to balance technological advancement with security in an era where digital tools increasingly intersect with national security. The international response to DeepSeek might set a precedent for how nations address the dual-edged sword of AI technology from geopolitical rivals.

This situation also prompts a broader conversation about the regulation of AI and data privacy across borders, highlighting the need for stringent cybersecurity measures and international cooperation to safeguard against the misuse of technology.

As this story develops, it will be crucial to watch how other countries react and whether this leads to a more comprehensive framework for AI governance and data protection on a global scale.

The Aging Nuclear Reactors of the USA: Extending Operations to 80 Years https://hadamard.com/c/the-aging-nuclear-reactors-of-the-usa-extending-operations-to-80-years/ Mon, 10 Feb 2025 12:13:38 +0000

In the United States, nuclear energy has been a significant part of the energy mix since the mid-20th century. However, as the reactors age, a pivotal question arises regarding their longevity: should they continue operating beyond their originally envisioned lifespan? Many of these nuclear power plants are now looking at license renewals that extend operations up to 80 years. Here’s an exploration of why this is happening, the cost considerations, and the associated risks.

An exterior view of Commonwealth Edison Company's Dresden nuclear power station near Morris, Illinois.

Why Extend the Lifespan?

Economic Considerations:

  • Capital Cost Recovery: Nuclear power plants are incredibly expensive to build, often taking decades from planning to operation. Extending the life of existing reactors means that the initial investment can be amortized over a longer period, reducing the cost per kilowatt-hour of electricity produced (a simple amortization sketch follows this list).
  • Avoidance of New Build Costs: Constructing new nuclear facilities is not only financially daunting but also politically and environmentally contentious. By extending the life of current reactors, there’s no need for the massive capital outlay that new construction would require.
  • Stable Energy Prices: Nuclear energy provides a stable baseline power supply, which is crucial for maintaining consistent electricity prices. Extending reactor life helps in maintaining this stability without the volatility associated with newer, possibly more expensive technologies.
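Here is a minimal sketch of the capital-cost argument in Python. The overnight cost, plant size, and capacity factor are made-up but plausibly scaled assumptions, not figures for any specific U.S. reactor, and the calculation deliberately ignores financing costs and discounting.

```python
# Illustrative amortization of a reactor's construction cost over 40 vs. 80 years.
# All inputs are assumptions chosen only to show the shape of the argument,
# and financing costs / discounting are ignored.

overnight_cost_usd = 10e9    # assumed construction cost of the plant
capacity_mw = 1_000          # assumed net electrical capacity
capacity_factor = 0.90       # assumed fraction of the year at full output

hours_per_year = 8_760
annual_kwh = capacity_mw * 1_000 * hours_per_year * capacity_factor

for lifetime_years in (40, 80):
    lifetime_kwh = annual_kwh * lifetime_years
    capital_cost_per_kwh = overnight_cost_usd / lifetime_kwh
    print(f"{lifetime_years} years: about {capital_cost_per_kwh * 100:.2f} cents/kWh of capital cost")
```

Doubling the operating life roughly halves the capital cost spread over each kilowatt-hour, which is the core of the economic case described above.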

The Process of License Extension

  • Regulatory Oversight: The Nuclear Regulatory Commission (NRC) oversees the license renewal process. Reactors initially licensed for 40 years can request a 20-year extension, and subsequently, another one, potentially allowing for up to 80 years of operation, provided they meet stringent safety standards.
  • Safety Assessments: Each extension requires comprehensive safety reviews, including inspections of aging components like reactor vessels, piping, and other critical systems to ensure they can operate safely for additional decades.

Risks and Challenges

Safety Concerns:

  • Aging Infrastructure: The primary concern with extending reactor life is the potential degradation of materials over time. Corrosion, embrittlement, and fatigue in metals can compromise safety.
  • Accident Risk: While modern safety systems have improved, the risk of accidents, even if low, increases with the age of a plant. Incidents like Three Mile Island and Fukushima remind us of the potential for catastrophic failure.
  • Waste Management: Already a contentious issue, the longer these plants run, the more nuclear waste is produced, without a long-term solution for its disposal in sight.

Environmental and Health Risks:

  • Radiation Exposure: Although managed, there’s always a risk of increased radiation exposure to workers and the surrounding community if safety protocols are not strictly adhered to.
  • Environmental Impact: Aging facilities might lead to increased emissions or environmental discharges if not properly maintained, posing risks to local ecosystems.

Public and Political Resistance:

  • Public Perception: There’s significant public skepticism regarding the safety of aging nuclear reactors, which can lead to political resistance against license extensions.
  • Policy and Regulation: Changing political climates can affect energy policy, including decisions on nuclear power, potentially leading to regulatory hurdles or shifts in public support.

Conclusion

The decision to extend the operational life of nuclear reactors in the U.S. to 80 years is primarily driven by economic factors but comes with considerable safety and environmental considerations. While the benefits of cost-saving and energy stability are clear, the potential risks associated with aging infrastructure cannot be ignored. The balance between these factors will continue to be a central theme in discussions about America’s energy future, requiring ongoing vigilance, investment in maintenance, and adherence to evolving safety standards to ensure that nuclear power remains a viable, safe option.

Elon Musk’s DOGE Feeds Sensitive U.S. Government Data to AI https://hadamard.com/c/elon-musks-doge-feeds-sensitive-u-s-government-data-to-ai/ Sun, 09 Feb 2025 10:57:25 +0000

Elon Musk’s team, operating under the banner of the “Department of Government Efficiency” (DOGE), is reportedly using artificial intelligence technology to scrutinize the operations of U.S. government departments. According to reports from The Washington Post and Wired, this initiative involves feeding sensitive data from agencies like the Department of Education into AI systems to analyze their programs and expenditures.


Sensitive Data Analysis

The Washington Post details that Musk’s employees have been using AI to sift through highly sensitive information, including personally identifiable data of federal employees who manage grants, as well as internal financial records. This is part of a broader strategy to identify potential areas for cost reduction across various government departments. The aim, as described by the newspaper, is to drastically cut the size and spending of the U.S. government, with the AI being employed to navigate and analyze vast amounts of data at unprecedented speeds.

However, this approach raises significant security concerns. Feeding such sensitive data into AI software increases the risk of data leaks or cyberattacks, not to mention the potential for AI-generated inaccuracies or “hallucinations” where AI might produce incorrect summaries or analyses.

Rapid Access and Expansion Plans

The DOGE team has reportedly been moving quickly, securing access to as much data as possible across different government entities. The plan, according to The Washington Post, is to replicate this process department by department, using AI to enhance efficiency and reduce costs.

Development of AI Chatbot – GSAi

Wired adds to the narrative by reporting that DOGE is also developing an AI chatbot named “GSAi” specifically for the General Services Administration (GSA), which supports other government agencies. This AI tool is designed to analyze a wide range of contracts and procurement orders to suggest savings. The development of GSAi stems from unsuccessful negotiations with Google regarding the use of their Gemini AI, pushing DOGE to create its own solution to meet government needs.

A Bold Approach to AI in Government

While the use of AI within U.S. government operations is not new, the aggressive implementation by Musk’s team marks a significant shift from previous cautious approaches. Historically, there have been warnings about the risks associated with AI in handling sensitive government data, but DOGE’s current strategy seems to prioritize speed and efficiency over such concerns.

This deployment of AI technology, while innovative, highlights the tension between leveraging new tools for governmental efficiency and safeguarding against the potential misuse or security breaches of sensitive data. The outcomes of these initiatives will likely be watched closely by both tech enthusiasts and privacy advocates alike, as they could set precedents for how AI is integrated into government operations moving forward.

Constellation Software Inc (CNSWF) Earnings Report https://hadamard.com/c/constellation-software-inc-cnswf-earnings-report/ Sat, 08 Feb 2025 17:00:00 +0000

On February 6, 2025, Constellation Software Inc (CNSWF), a Canadian enterprise known for acquiring, building, and managing vertical market software businesses, announced its latest quarterly earnings. With a market capitalization hovering around $72.76 billion, the company’s performance has been under keen observation by investors and market analysts alike.

Earnings Overview:

Constellation Software reported an earnings per share (EPS) of $2.73 for the quarter, well short of the $18.756 per share cited in previous forecasts. The large discrepancy could be due to different reporting periods or adjustments in expectations not reflected in the latest analyst consensus.

Revenue:

The company’s revenue for the quarter wasn’t explicitly detailed in the immediate reports, but given the known business model of Constellation, the company appears to be continuing to grow through acquisitions and organic growth in its niche markets. The company’s strategy of focusing on specialized software solutions for specific industries has evidently paid off, with revenue growth remaining a significant driver of reported earnings.

Stock Performance Post-Earnings:

Following the earnings announcement, CNSWF’s stock saw a typical fluctuation in after-hours trading. Exact figures for the stock price movement were not available, and investors might take a wait-and-see approach considering the broader market conditions and other external factors influencing tech stocks.

Analyst Reactions:

Analysts have been largely optimistic about Constellation Software. RBC Capital, among others, has maintained a ‘Buy’ rating on the stock, citing the company’s consistent growth through its acquisition strategy and its ability to integrate and monetize these businesses effectively. TipRanks and other platforms noted several positive upgrades and new buy recommendations in the weeks leading up to the earnings, reflecting confidence in the company’s operational model.

  • RBC Capital: “Constellation Software continues to outperform in its verticals, with this quarter’s results reinforcing our buy recommendation.”
  • Jefferies: Engaged in a conference call with Constellation’s management, suggesting a deep interest in the company’s strategic direction post-earnings.

Future Outlook:

Looking forward, analysts at Fintel.io have projected an average one-year price target of $3,487.04 for CNSWF, with estimates ranging from $3,152.60 to $3,901.73. This optimism is based on the company’s historical performance and its strategy of acquiring businesses that can be scaled and integrated into its portfolio, thereby enhancing shareholder value.

Challenges and Considerations:

Despite the positive outlook, there are considerations to keep in mind:

  • Integration Risks: The complexity of integrating multiple businesses into one cohesive entity could pose challenges.
  • Market Saturation: As Constellation grows, finding new acquisitions in the same vein might become more challenging or costly.
  • Economic Environment: Broader economic conditions, including interest rates and economic growth rates in key markets, can impact tech spending.

Conclusion:

Constellation Software Inc’s latest earnings release was a testament to its robust business model, focusing on niche markets with high demand for specialized software. While the immediate market reaction was not detailed, the fundamental health of the company appears strong, backed by positive analyst commentary and a clear strategy for future growth. Investors would do well to watch how the company navigates the challenges of growth and integration in the coming quarters.

Shopify Inc. (SHOP) Reports Stellar Q4 2024 Earnings https://hadamard.com/c/shopify-inc-shop-reports-stellar-q4-2024-earnings-a-deep-dive-into-the-numbers/ Sat, 08 Feb 2025 17:00:00 +0000

Ottawa, ON – On Tuesday, February 11, 2025, Shopify Inc. (NYSE: SHOP) unveiled its financial results for the fourth quarter of 2024, delivering numbers that not only surpassed Wall Street’s expectations but also underscored its robust growth trajectory in the e-commerce sector. Here’s a comprehensive analysis of Shopify’s latest earnings report.

Financial Highlights:

  • Earnings Per Share (EPS): Shopify reported an EPS of $0.4274, comfortably ahead of consensus estimates and reflecting Shopify’s enhanced profitability.
  • Revenue: The company’s revenue reached an impressive $2.73 billion for the quarter, a testament to its ability to grow amidst a competitive digital marketplace. This figure was well above the anticipated revenue, highlighting Shopify’s expanding market share and successful monetization strategies.
  • Gross Merchandise Volume (GMV): Shopify processed a GMV of $151.71 billion during the quarter, indicating a substantial volume of transactions through its platform. This metric is crucial as it reflects the scale at which merchants are leveraging Shopify’s infrastructure for their sales. A simple take-rate calculation based on these figures appears after this list.
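One way to read the revenue and GMV figures together is the platform “take rate,” revenue as a share of GMV. The sketch below simply divides the numbers quoted above; it is an illustration of the ratio, not a disclosed Shopify metric for the quarter.

```python
# Rough "take rate" implied by the quarterly figures quoted above:
# revenue as a percentage of gross merchandise volume (GMV).
revenue_usd = 2.73e9   # reported Q4 2024 revenue
gmv_usd = 151.71e9     # reported Q4 2024 GMV

take_rate = revenue_usd / gmv_usd
print(f"Implied take rate: {take_rate:.2%}")  # roughly 1.8% of GMV captured as revenue
```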

Key Performance Indicators:

  • Merchant Growth: Shopify has continued to expand its merchant base, adding thousands of new businesses to its ecosystem, particularly focusing on international expansion and small to medium-sized enterprises (SMEs).
  • Product Innovation: The quarter saw the introduction of several new features and enhancements to Shopify’s platform, including improvements in payment processing, logistics solutions, and AI-driven tools for merchants, which have been pivotal in driving user engagement and retention.
  • Operating Margin: A closer look at Shopify’s operating margin suggests a marked improvement, reflecting better cost management and operational efficiency. This is particularly noteworthy given the company’s historical investments in growth which sometimes led to thinner margins.

Market Reaction:

Following the earnings announcement, Shopify’s stock experienced significant positive movement in pre-market trading, with investors and analysts alike expressing optimism about the company’s direction. The stock’s performance is also influenced by broader market sentiments towards tech and e-commerce sectors, where Shopify holds a leading position.

Analyst Perspectives:

  • Growth Trajectory: Analysts from various investment firms have upgraded their forecasts for Shopify, citing its robust growth in revenue and GMV as indicators of sustained momentum.
  • Profitability Concerns: While the EPS beat was impressive, some analysts caution about the sustainability of such growth if Shopify continues to invest heavily in new markets and technologies. However, the consensus seems to lean towards optimism, banking on Shopify’s historical ability to balance growth with profitability.
  • Future Outlook: There’s a general expectation that Shopify will continue to benefit from the ongoing shift to online commerce, especially with its focus on providing end-to-end solutions for businesses of all sizes.

Conclusion:

Shopify’s Q4 2024 earnings report paints a picture of a company that’s not only growing but doing so with increased profitability. As the e-commerce landscape continues to evolve, Shopify’s strategic moves towards enhancing its platform and expanding globally position it well for future growth. Investors and merchants alike will be watching closely how Shopify navigates the opportunities and challenges ahead in this dynamic sector.

For more insights into Shopify’s performance and future strategies, tune into the upcoming investor calls or stay updated with industry analyses.

Lyft (LYFT) Earnings Report: A Detailed Analysis https://hadamard.com/c/lyft-lyft-earnings-report-a-detailed-analysis/ Sat, 08 Feb 2025 17:00:00 +0000

Lyft, Inc., one of the leading players in the ride-sharing industry, is set to release its latest quarterly earnings on Tuesday, with expectations high given recent performance trends and market conditions. Here’s a deep dive into what investors and market watchers might expect from Lyft’s earnings report.

Overview of Expectations:

  • Earnings Per Share (EPS): Analysts are anticipating an EPS of $0.2074 for this quarter. If met, this figure would point to a stabilization or slight improvement in profitability per share compared with previous quarters.
  • Revenue Forecast: The revenue is expected to hit approximately $1.56 billion. This number is crucial as it indicates the company’s ability to grow its user base and maintain or increase its average fare per ride amidst competitive pressures from rivals like Uber.
  • Market Cap Context: With a market capitalization hovering around $5.88 billion, the earnings report will be pivotal in either affirming or challenging the current valuation of Lyft in the market. A rough price-to-sales sketch based on these expectations follows this list.
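For context on that valuation, the sketch below derives a rough price-to-sales multiple by annualizing the expected quarterly revenue. Annualizing a single quarter is a simplification that ignores seasonality and growth, so treat the output as a ballpark figure only.

```python
# Ballpark price-to-sales multiple from the expectations quoted above.
# Annualizing one quarter ignores seasonality and growth; illustration only.
market_cap_usd = 5.88e9
expected_quarterly_revenue_usd = 1.56e9

annualized_revenue = expected_quarterly_revenue_usd * 4
price_to_sales = market_cap_usd / annualized_revenue
print(f"Implied price-to-sales: {price_to_sales:.2f}x")  # roughly 0.9x annualized revenue
```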

Key Factors to Watch:

  1. User Growth and Engagement:
    • The number of active riders and the frequency of rides are critical metrics. Any significant growth here could signal increased market penetration or successful marketing strategies.
  2. Pricing Strategy and Competition:
    • How Lyft has managed its pricing in response to Uber’s moves will be under scrutiny. Price wars, if any, could affect the bottom line but might also be necessary for market share gains.
  3. Operational Efficiency:
    • Cost management, particularly in terms of driver incentives, vehicle maintenance, or technology development, will be key. Improving operational efficiency directly impacts profitability.
  4. Geographic Expansion and New Services:
    • Expansion into new markets or the introduction of new service lines like bike-sharing or autonomous vehicles could be highlighted. These initiatives could provide insights into Lyft’s long-term strategy.
  5. Regulatory Environment:
    • Comments on how regulatory changes or legal challenges have affected operations could influence investor perceptions of risk and future growth.

Post-Earnings Market Reaction:

  • Stock Movement: Historically, Lyft’s stock has seen significant volatility post-earnings. Given the EPS and revenue expectations, a beat could propel the stock upward, especially if accompanied by a positive outlook for future growth.
  • Analyst Revisions: Depending on the results, analysts might revise their price targets and ratings, which could significantly influence investor sentiment.
  • Comparison with Peers: Immediate comparisons with Uber’s performance (reported earlier or later in the same period) will be inevitable. Lyft’s performance relative to Uber will be a focal point for many investors.

Looking Forward:

The upcoming earnings report is not just a reflection of Lyft’s financial health but also a commentary on the broader ride-sharing and mobility market. Investors will be eager to see how Lyft positions itself to leverage technological advancements, navigate regulatory landscapes, and compete in an increasingly crowded market.

As we await the official release, it’s clear that Lyft’s narrative for the quarter will hinge on its ability to demonstrate sustainable growth, operational efficiency, and a clear path to profitability in a challenging industry.

The Hype on Quantum Stocks and the Speculation https://hadamard.com/c/the-hype-on-quantum-stocks-and-the-speculation/ Sat, 08 Feb 2025 14:01:59 +0000

Quantum computing has emerged as one of the most talked-about areas in technology investment over the past year, with stock prices of companies in this sector experiencing dramatic swings. The allure of quantum computing lies in its promise to solve complex problems that are currently beyond the capabilities of classical computers, potentially revolutionizing fields like cryptography, drug discovery, and artificial intelligence.

The Surge in Quantum Stocks

Several key events have fueled the recent surge in quantum computing stocks. Alphabet’s announcement of its Willow quantum chip, which claims to perform calculations in minutes that would take traditional supercomputers millennia, has significantly boosted investor interest. Companies like IonQ, D-Wave Quantum, Rigetti Computing, and Quantum Computing Inc. have seen their stock prices skyrocket. For instance, IonQ’s stock climbed an impressive 258.5% in 2024 alone, and Rigetti Computing experienced an 851.2% surge, according to data from The Motley Fool.
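Percentage gains of that size are easier to grasp as multiples of an initial position. The short sketch below converts the figures quoted above into multiples for a hypothetical $1,000 position; it says nothing about whether such moves are sustainable.

```python
# Convert the quoted 2024 share-price gains into growth multiples
# for a hypothetical $1,000 position bought at the start of the year.
gains_pct = {"IonQ": 258.5, "Rigetti Computing": 851.2}
initial_investment = 1_000

for name, gain in gains_pct.items():
    multiple = 1 + gain / 100
    final_value = initial_investment * multiple
    print(f"{name}: {multiple:.2f}x, i.e. ${final_value:,.0f} from ${initial_investment:,}")
```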

The Defiance Quantum ETF (QTUM), which invests in companies at the forefront of quantum computing, has soared by 49.4% year to date, nearly doubling the S&P 500’s gain, highlighting the sector’s explosive growth. This enthusiasm is not solely driven by hype; it’s also supported by tangible advancements. Google’s new quantum chip, Nvidia’s CUDA-Q platform, and Microsoft’s partnerships for quantum research all indicate that the technology is progressing from theoretical to potentially practical applications.

The Speculation Factor

However, beneath the surface of this optimism lurks a significant amount of speculation. Quantum computing is still in its nascent stages, with many of the leading companies reporting minimal revenue or operating at a loss. For example, Quantum Computing Inc. (QUBT) and Rigetti Computing (RGTI) have been criticized for valuations that seem inflated compared to their financial performance, suggesting a market driven more by future potential than current earnings.

Nvidia’s CEO, Jensen Huang, has voiced skepticism about the immediate practicality of quantum computing, suggesting that practical applications might still be decades away. This skepticism has led to volatility in stock prices, with stocks like QUBT, QBTS, RGTI, and IONQ experiencing notable declines after initial surges, as reported by “Smartphone Magazine”.

Investment Considerations

Investing in quantum computing stocks involves navigating a landscape where the potential for groundbreaking innovation is as vast as the risks. Here’s what investors should consider:

  • High Growth Potential: The market for quantum computing is projected to grow significantly, with estimates suggesting a market size of $65 billion by 2030. This growth is driven by applications in diverse sectors, from pharmaceuticals to logistics.
  • Volatility: The sector’s speculative nature means stock prices can fluctuate wildly based on news, technological breakthroughs, or lack thereof.
  • Competition and Innovation: The race to develop quantum computing technology involves not only start-ups but also tech giants like Alphabet, Amazon, and Microsoft, which could lead to market consolidation or unexpected breakthroughs.
  • Diversification: Given the risks, a diversified approach, like investing in ETFs such as QTUM, might be prudent for those looking to capitalize on quantum computing without betting on single companies.

The Road Ahead

While the hype around quantum stocks is palpable, the road to profitability and practical application remains uncertain. Investors are advised to approach this sector with caution, balancing the optimism of technological innovation with the realities of market speculation. As quantum computing evolves, it might not only redefine computing but also how we approach investments in emerging technologies.

Conclusion

Quantum computing stocks represent both an exciting frontier and a speculative venture. The potential to revolutionize industries is undeniable, but the path to realizing this potential is fraught with challenges. As the sector matures, distinguishing between hype and genuine progress will be crucial for investors looking to navigate this new technological paradigm.

Massive Outage Strikes PlayStation Network https://hadamard.com/c/massive-outage-strikes-playstation-network/ Sat, 08 Feb 2025 10:18:33 +0000

On the evening of February 7, 2025, PlayStation Network (PSN) experienced a significant outage that left millions of gamers worldwide unable to access online features. The disruption, which began around 6 PM PST, affected multiple core services including account management, gaming, social interactions, PlayStation Video, the PlayStation Store, and the PlayStation Direct website.

The Outage Unveiled

Reports of the network issues started surfacing on social platforms and outage tracking sites like Downdetector, where the number of user reports spiked dramatically, reaching nearly 70,000 at its peak. Users encountered error messages such as “WS-116449-5,” indicating that the network was undergoing maintenance, although no scheduled maintenance had been announced by Sony prior to the incident.

User Frustration and Social Media Response

The timing of the outage, coinciding with a peak gaming period on a Friday night, led to significant backlash from the gaming community. Social media was quickly flooded with complaints, memes, and even humorous jests about the situation. PlayStation fans expressed their frustration, with some venting that the outage felt “criminal” for happening on a night meant for gaming. Meanwhile, rival console enthusiasts took the opportunity to highlight the reliability of their platforms.

Sony’s Acknowledgement

Sony Interactive Entertainment responded to the outage via their official support channels, acknowledging the issues and promising a swift resolution. However, the initial response was criticized for lacking detail about the cause and an estimated time for service restoration. According to the PSN status page, all services were listed as “experiencing issues,” but without further specifics on the nature of the problem.

Possible Causes and Historical Context

While the exact cause of the outage remains officially unconfirmed, several potential reasons have been speculated upon by the community. These include:

  • Server Overload: Given the high user activity on a Friday evening, the servers might have been overwhelmed.
  • Technical Glitches: An internal error or software glitch could have been the culprit, though Sony has not confirmed this.
  • Cyber Attack: Although no evidence has been presented to support this, some users drew parallels to the infamous 2011 PSN hack which led to a 23-day network shutdown, stirring concerns about security.

This recent outage, while nowhere near as prolonged as the 2011 incident, still stirred memories of that event, reminding users of the vulnerabilities in online gaming infrastructure.

The Recovery Process

By late evening, there were signs of recovery. Posts on X (formerly Twitter) and other platforms suggested that some services were slowly coming back online, with the number of reported issues on Downdetector decreasing. However, at the time of writing, no official statement from Sony had confirmed full service restoration, leaving many players in limbo about whether they could return to their gaming sessions.

GitHub Copilot Enhanced: Introducing Agent Mode and Copilot Edits in VS Code https://hadamard.com/c/github-copilot-enhanced-introducing-agent-mode-and-copilot-edits-in-vs-code/ Sat, 08 Feb 2025 09:50:41 +0000

GitHub has recently unveiled a significant enhancement to its AI-powered coding assistant, GitHub Copilot, introducing two groundbreaking features: Agent Mode and Copilot Edits, now available in Visual Studio Code (VS Code). These updates not only elevate the capabilities of Copilot but also redefine how developers interact with their coding environment.


Agent Mode: A New Level of Autonomy

Agent Mode marks a pivotal shift in GitHub Copilot’s functionality, transforming it from a mere code suggestion tool into an autonomous agent capable of handling complex coding tasks. Here’s what it brings to the table:

  • Self-Iteration: In Agent Mode, GitHub Copilot can iterate on its own code, identifying errors and correcting them without human intervention. This self-healing ability means that Copilot can now refine its outputs in real-time, ensuring higher accuracy and efficiency.
  • Error Analysis and Terminal Suggestions: Copilot can analyze run-time errors with new self-healing capabilities, suggesting terminal commands when necessary. This feature significantly reduces the time developers spend on debugging.
  • Task Inference: Beyond executing the tasks explicitly requested, Agent Mode can infer additional tasks that are necessary to complete the primary request, making the coding process more intuitive and less labor-intensive.

To access Agent Mode, developers need to use VS Code Insiders and enable the appropriate setting. GitHub plans to expand this feature to other integrated development environments (IDEs) supported by Copilot, promising a broader impact across different platforms.

Copilot Edits: Streamlining Multi-File Changes

Alongside Agent Mode, GitHub has rolled out Copilot Edits to general availability in VS Code, offering:

  • Natural Language Multi-File Editing: Developers can now specify a set of files and use natural language to instruct Copilot on what changes to make. This feature streamlines the editing process across multiple files, allowing for inline changes that can be reviewed and adjusted with ease.
  • Model Choice: Users can select from various AI models like OpenAI’s GPT-4o, Anthropic’s Claude 3.5 Sonnet, and Google’s Gemini 2.0 Flash for editing tasks, tailoring the AI assistance to their specific needs.
  • Performance Enhancements: Future updates will focus on improving performance, including better handling of speculative decoding and maintaining context from Copilot Chat to Edits.

Project Padawan: The Future of Automated Development

While still in the preview phase, GitHub introduced a glimpse into “Project Padawan,” an autonomous software engineering (SWE) agent. This project aims to automate even more routine tasks, allowing developers to focus on more challenging and creative aspects of software development:

  • Task Automation: Once fully implemented, developers will be able to assign GitHub Issues directly to Copilot, which will then generate, test, and submit pull requests autonomously.
  • Review and Feedback Integration: Copilot will assign human reviewers to its pull requests and incorporate their feedback, effectively acting as an additional team member that understands the project’s guidelines and conventions.

Conclusion

The introduction of Agent Mode and Copilot Edits represents GitHub’s ongoing commitment to enhancing developer productivity through AI. By integrating more sophisticated AI functionalities, GitHub Copilot not only aids in writing code but also in managing and refining the entire development lifecycle. As these features roll out, they promise to make software development faster, more efficient, and decidedly more intelligent. For developers, this means more time for innovation and less on routine coding tasks, heralding a new era of development where AI and humans code side by side.

]]>
1057
Cloudflare Delivers Solid Q4 2024 Earnings Amid Cautious Outlook for 2025 https://hadamard.com/c/cloudflare-delivers-solid-q4-2024-earnings-amid-cautious-outlook-for-2025/ Fri, 07 Feb 2025 14:57:59 +0000 https://hadamard.com/c/?p=1054 Continue reading Cloudflare Delivers Solid Q4 2024 Earnings Amid Cautious Outlook for 2025]]> San Francisco, CA – Cloudflare, Inc. (NYSE: NET), a leading connectivity cloud company, reported its financial results for the fourth quarter of 2024 yesterday. The company’s performance reflected robust growth, although its guidance for the upcoming quarter and fiscal year introduced a note of caution among investors.

Q4 2024 Performance Highlights

Cloudflare reported revenue of $459.9 million for the fourth quarter, a 27% increase year-over-year that surpassed market expectations of $452 million. Earnings per share (EPS) came in at $0.19, slightly beating the consensus estimate of $0.18. The results underscore continued strong demand for Cloudflare’s services, with significant growth in both revenue and customer base.

Key financial metrics included:

  • Gross Margin: 77.6%, a slight decrease from previous quarters but still within the company’s long-term target range.
  • Operating Profit: Reached $67.2 million, with an operating margin of 14.6%.
  • Free Cash Flow: Reported at $47.8 million for the quarter, contributing to a full-year total of $166.9 million.
  • Large Customers: Cloudflare ended the year with 3,497 large customers, up 27% from the previous year, showcasing its expanding enterprise engagement.
  • Dollar-Based Net Retention: Stood at 111%, a slight sequential increase, highlighting customer loyalty and growth in existing customer spend.

Strategic Developments and Innovations

During the earnings call, Cloudflare’s executives detailed their strategic focus on AI and customer-centric product development. The company is seeing an increase in AI-related workloads at the edge, with CEO Matthew Prince emphasizing the importance of this shift from AI training to AI inference for future growth.

Cloudflare also highlighted its compliance strategy with the announcement of FedRAMP High status, aiming to bolster its government business without compromising network integrity. This move is expected to enhance its market position in federal segments both domestically and internationally.

Outlook for Q1 and FY 2025

Despite a strong end to 2024, Cloudflare provided guidance that was below consensus for the first quarter of 2025:

  • Q1 2025 Revenue: Expected to be between $468 million and $469 million, suggesting a year-over-year growth of about 24%, which is lower than what the market anticipated.
  • Q1 2025 EPS: Forecasted at $0.16, below the consensus of $0.18 per share.

For the full year 2025, Cloudflare anticipates revenue to be in the range of $2.090 billion to $2.094 billion, which aligns with market expectations for a 25% year-over-year increase. However, the EPS guidance for the year was also below consensus at $0.70 to $0.72, against an expected $0.74.

Market Reaction

Following the earnings announcement, Cloudflare’s stock saw an initial positive reaction but settled with a more cautious uptick due to the conservative guidance. Posts on X indicated a mixed sentiment, with some investors focusing on the earnings beat while others expressed concern over the guidance, particularly for Q1 2025.

Conclusion

Cloudflare’s Q4 2024 results demonstrate a solid performance with significant year-over-year growth, driven by an expanding customer base and strategic product development. However, the cautious outlook for 2025 has introduced some uncertainty, reflecting broader economic conditions and perhaps a more conservative approach to forecasting in light of potential market volatility. Investors will likely keep a close watch on how Cloudflare navigates this landscape, especially with its ambitious plans in AI and enterprise markets.

]]>
1054
Elon Musk’s Vision for Quantum Key Distribution via Starlink Satellites https://hadamard.com/c/elon-musks-vision-for-quantum-key-distribution-via-starlink-satellites/ Fri, 07 Feb 2025 10:30:59 +0000 https://hadamard.com/c/?p=1013 Continue reading Elon Musk’s Vision for Quantum Key Distribution via Starlink Satellites]]> The advent of quantum communication has the potential to redefine global internet security, particularly in the realm of satellite communications. This article delves into Elon Musk’s ambitious plan to integrate Quantum Key Distribution (QKD) into SpaceX’s Starlink satellite constellation, exploring the theoretical underpinnings, technological challenges, and potential impacts on global cybersecurity.

Introduction

Quantum Key Distribution (QKD) leverages the principles of quantum mechanics to generate secure cryptographic keys, offering an unprecedented level of security against cyber threats. Elon Musk’s SpaceX, through its Starlink project, aims to extend this quantum security to space, connecting satellites with ground stations and potentially between satellites themselves for a global quantum internet. This initiative could drastically enhance the security of communications, particularly in areas like military operations, financial transactions, and private data transmission.

Theoretical Background

QKD uses quantum states to distribute cryptographic keys. The most commonly referenced method involves sending photons prepared in different polarization states. If an eavesdropper attempts to measure these photons, the measurement disturbs their quantum states, introducing detectable errors that alert the communicating parties to the breach; a toy simulation of this principle appears after the list below.

  • Quantum Entanglement: Essential for Starlink’s potential QKD system, where entangled photon pairs could be used to ensure that any interception of the communication would be immediately detectable due to the disturbance in the quantum state.
  • Photon Transmission: Challenges include maintaining the integrity of quantum information over vast distances in space, where environmental factors like cosmic radiation could interfere with quantum states.
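
To illustrate the sifting-and-eavesdropping check described above, here is a toy, textbook-style BB84 simulation. It is purely illustrative and assumes ideal detectors and an idealized channel; it says nothing about how a Starlink-based system would actually be engineered.

```python
# Toy BB84-style simulation: random bits/bases for sender and receiver,
# key sifting on matching bases, and an error-rate check that reveals an
# eavesdropper who measures in random bases. Illustrative only; it models
# none of the optical or orbital engineering a real satellite QKD link needs.
import random

def bb84_round(n_photons: int, eavesdropper: bool, seed: int = 7):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.choice("+x") for _ in range(n_photons)]
    bob_bases   = [rng.choice("+x") for _ in range(n_photons)]

    channel_bits = list(alice_bits)
    if eavesdropper:
        for i in range(n_photons):
            eve_basis = rng.choice("+x")
            if eve_basis != alice_bases[i]:
                # A wrong-basis measurement randomizes the re-sent photon.
                channel_bits[i] = rng.randint(0, 1)

    bob_bits = [
        channel_bits[i] if bob_bases[i] == alice_bases[i] else rng.randint(0, 1)
        for i in range(n_photons)
    ]

    # Sifting: keep only positions where Alice and Bob chose the same basis.
    kept = [i for i in range(n_photons) if alice_bases[i] == bob_bases[i]]
    errors = sum(alice_bits[i] != bob_bits[i] for i in kept)
    return len(kept), errors / max(len(kept), 1)

for eve in (False, True):
    kept, qber = bb84_round(2000, eavesdropper=eve)
    print(f"eavesdropper={eve}: sifted bits={kept}, error rate={qber:.1%}")
```

Running it shows an error rate near 0% without an eavesdropper and around 25% with one, which is the statistical signature the communicating parties would use to abort key generation.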

SpaceX Starlink and Quantum Communication

Starlink currently operates thousands of satellites in low Earth orbit (LEO), providing high-speed internet to underserved areas. Integrating QKD into this constellation involves:

  • Satellite Design: Redesigning satellites to incorporate quantum transmitters and receivers, possibly employing technologies like single-photon detectors and quantum random number generators.
  • Ground Stations: Upgrading or introducing new ground stations capable of quantum communication, ensuring they can interface with the satellite’s quantum systems.
  • Laser Communication: The use of laser links for inter-satellite and satellite-to-ground communication to transmit quantum information, minimizing loss and interference.

Challenges and Considerations

  • Technological Hurdles: The technology for stable quantum communication in space is still nascent. Maintaining quantum coherence over significant distances and in the harsh conditions of space poses substantial challenges.
  • Cost and Infrastructure: The economic aspect of implementing such advanced technology on a large scale is considerable. It involves not only the cost of satellite upgrades but also the infrastructure for quantum ground stations worldwide.
  • Scalability: Transitioning from a few experimental satellites to a full constellation operating with QKD requires a scalable approach, both in technology and in the management of quantum keys across a dynamic network.
  • Regulatory and Security Issues: The integration of QKD into global satellite networks raises new regulatory challenges regarding space law, data privacy, and international security policies.

Potential Impacts

  • Cybersecurity: If successful, this could enable key exchange that cannot be intercepted without detection, enhancing national security, banking, and personal data protection.
  • Global Internet: A quantum-secured internet could provide a new layer of security for internet users, particularly in regions prone to cyber-attacks or where traditional encryption methods might soon be compromised by quantum computers.
  • Scientific Research: The infrastructure could also support advanced scientific experiments in quantum mechanics, possibly leading to further breakthroughs in quantum technology.

Conclusion

Elon Musk’s vision for integrating Quantum Key Distribution into the Starlink network is both ambitious and forward-thinking. While there are significant hurdles to overcome, the potential benefits to global security and connectivity could be transformative. The success of this project will depend on overcoming technical challenges, securing funding, and navigating international cooperation and regulation.

This article provides an overview based on available public information and anticipates further developments as this field progresses.

]]>
1013
Moore’s Law Is Dead: A Look at Nvidia’s RTX 4090 vs RTX 5090 https://hadamard.com/c/moores-law-is-dead-a-look-at-nvidias-rtx-4090-vs-rtx-5090/ Thu, 06 Feb 2025 13:37:00 +0000 https://hadamard.com/c/?p=1017 Continue reading Moore’s Law Is Dead: A Look at Nvidia’s RTX 4090 vs RTX 5090]]> Moore’s Law, the observation that the number of transistors on a microchip doubles approximately every two years, has long been the bedrock of technological advancement in computing. However, recent developments, particularly with Nvidia’s RTX 4090 and RTX 5090 graphics cards, suggest that we might be witnessing the twilight of Moore’s Law in the realm of consumer graphics processing units (GPUs).

The Legacy of Moore’s Law

For decades, the semiconductor industry thrived under Moore’s Law, enabling exponential growth in computing power at a predictable pace. This law not only provided a roadmap for engineers but also set consumer expectations for performance improvements in consumer electronics, including GPUs.

Nvidia’s RTX 4090 and RTX 5090: A Reality Check

RTX 4090: Launched in 2022, the RTX 4090 was a marvel of its time, leveraging Nvidia’s Ada Lovelace architecture. It boasted:

  • CUDA Cores: 16,384
  • Memory: 24GB GDDR6X at 1TB/s bandwidth
  • Base/Boost Clock: 2.23 GHz / 2.52 GHz
  • Total Graphics Power (TGP): 450W

This GPU was a testament to the power of modern GPU architecture, providing exceptional performance for both gaming and professional applications.

RTX 5090: Introduced in early 2025, the RTX 5090 steps into the spotlight with Nvidia’s new Blackwell architecture, promising:

  • CUDA Cores: 21,760
  • Memory: 32GB GDDR7 at 1.8 TB/s bandwidth
  • Base/Boost Clock: 2.01 GHz / 2.41 GHz
  • Total Graphics Power (TGP): 575W

The Death of Moore’s Law?

Moore’s Law

When comparing these two GPUs, several points suggest that the straightforward scaling predicted by Moore’s Law is no longer in play:

  1. Performance Gains vs. Power Consumption: The RTX 5090 offers a performance increase of about 25-30% over the RTX 4090. This is a significant leap, but the total graphics power has also increased by roughly 28% (from 450W to 575W). This ratio of performance gain to added power suggests a diminishing return compared to previous generations, where performance jumps were more substantial with less of an energy penalty (see the worked comparison after this list).
  2. Core Count and Clock Speeds: Although the RTX 5090 has more CUDA cores, the clock speed has decreased, hinting at the challenges in maintaining the traditional pace of transistor scaling without hitting thermal and power efficiency walls.
  3. Manufacturing Process: Moving from a 5nm process in the RTX 4090 to a 4nm in the RTX 5090 does not yield the expected performance scaling. This indicates that we’re nearing the physical limits of current semiconductor technology, where smaller process nodes do not automatically translate to proportionate performance gains.
  4. Cost vs. Benefit: The RTX 5090’s price point, rumored around $2,000 or more, reflects not just the cost of manufacturing but also an adjustment to the market where Moore’s Law can no longer guarantee cost-effective performance enhancements. Consumers are now paying more for increasingly marginal improvements.
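
A quick back-of-the-envelope calculation, using only the TGP figures quoted above and the midpoint of the 25-30% performance estimate, shows how flat the performance-per-watt picture looks between the two cards. The numbers are approximations, not measured benchmarks.

```python
# Back-of-the-envelope comparison using the figures quoted above.
# The 27.5% performance uplift is the midpoint of the 25-30% estimate;
# treat all values as approximations, not measured benchmarks.
specs = {
    "RTX 4090": {"tgp_w": 450, "relative_perf": 1.000},
    "RTX 5090": {"tgp_w": 575, "relative_perf": 1.275},
}

base = specs["RTX 4090"]
for name, s in specs.items():
    perf_per_watt = s["relative_perf"] / s["tgp_w"]
    print(f"{name}: {s['tgp_w']} W, perf/W = {perf_per_watt * 1000:.2f} (arbitrary units)")

perf_increase = specs["RTX 5090"]["relative_perf"] / base["relative_perf"] - 1
power_increase = specs["RTX 5090"]["tgp_w"] / base["tgp_w"] - 1
print(f"performance +{perf_increase:.1%} vs power +{power_increase:.1%}")
# Roughly flat perf/W: most of the extra speed is bought with extra power.
```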

What This Means for the Future

  • Innovation Shift: Manufacturers like Nvidia are now focusing on architecture optimization, AI enhancements (like DLSS 4), and memory technology rather than just transistor count. This shift from raw power to smart power utilization suggests a new era where software and hardware integration will define progress.
  • Sustainability Concerns: The increase in power consumption per unit of performance gain raises questions about sustainability, pushing the industry towards more energy-efficient solutions.
  • Consumer Expectations: The predictable timeline of performance upgrades is being replaced by less predictable, but often more specialized, advancements tailored to specific applications like AI, VR, or ultra-high-definition gaming.

Conclusion

The comparison between Nvidia’s RTX 4090 and RTX 5090 does more than just benchmark performance; it signals a significant shift in the tech landscape. Moore’s Law, as we knew it, might indeed be “dead” for consumer GPUs, but this opens up new avenues for innovation. The future of computing might not be about cramming more transistors into less space but about how creatively and efficiently those transistors can be used. This evolution reflects a mature industry adapting to physical limits while striving to redefine what performance means in the digital age.

]]>
1017
Qualcomm’s Q1 Fiscal 2025 Earnings https://hadamard.com/c/qualcomms-q1-fiscal-2025-earnings/ Thu, 06 Feb 2025 12:16:55 +0000 https://hadamard.com/c/?p=1043 Continue reading Qualcomm’s Q1 Fiscal 2025 Earnings]]> Qualcomm Incorporated (NASDAQ: QCOM) unveiled its first-quarter fiscal 2025 earnings yesterday, presenting a robust set of financial results that not only exceeded analysts’ expectations but also painted a picture of resilience and growth amidst a volatile market landscape.

Qualcomm Building

Financial Highlights

  • Revenue: Qualcomm reported a revenue of $11.67 billion, surpassing the consensus estimate of $10.93 billion. This represents an 18% increase from the $9.92 billion reported in the same quarter of the previous year.
  • Earnings Per Share (EPS): Adjusted EPS came in at $3.41, significantly higher than the expected $2.96 per share, showcasing the company’s ability to maintain profitability amidst global economic uncertainties.
  • Segment Performance:
    • Mobile Handsets: Sales in this segment grew by 13% year-over-year to $7.57 billion, outstripping analyst expectations of $7.04 billion. The growth was fueled by demand for premium-tier smartphones, particularly in China, and by Qualcomm’s chips powering Samsung’s latest Galaxy devices.
    • Automotive: This sector saw a remarkable 61% increase, reaching $961 million in sales. Qualcomm’s long-term contracts are beginning to bear fruit, positioning the company favorably in the expanding automotive technology market.
    • Technology Licensing (QTL): Qualcomm’s licensing division generated $1.54 billion, with significant deals signed, including with Chinese smartphone maker Transsion for 4G licenses.

CEO’s Outlook

During the earnings call, CEO Cristiano Amon highlighted Qualcomm’s strategic positioning in the AI and 5G markets. He noted the company’s advancements in edge AI technologies, which are becoming increasingly vital across various industries. Amon emphasized Qualcomm’s role in enabling AI inference at the edge, potentially opening new revenue streams beyond traditional mobile technology.

Market Reaction and Analyst Perspectives

The stock market responded positively to Qualcomm’s earnings report, with shares reacting in extended trading. Analysts have been adjusting their outlooks, with some revising earnings forecasts upward, reflecting increased confidence in Qualcomm’s long-term financial strength. Despite warnings from some quarters about potential market saturation and geopolitical risks, the consensus remains bullish. Qualcomm received a “Moderate Buy” rating, with analysts pointing to its diversified product portfolio and market share gains, especially in AI-powered devices, as reasons for optimism.

Challenges and Future Prospects

While Qualcomm’s performance was commendable, it’s not without its challenges. The semiconductor industry is facing headwinds from trade tensions and supply chain disruptions. However, Qualcomm’s focus on diversifying beyond smartphones into automotive, IoT, and AI markets is seen as a strategic move to mitigate these risks. The company’s guidance for the next quarter was also strong, with revenue expected between $10.2 billion and $11 billion, and adjusted EPS between $2.70 and $2.90, both above consensus estimates.

Conclusion

Qualcomm’s Q1 fiscal 2025 earnings report stands as a testament to its resilience and strategic foresight in navigating through complex market dynamics. With a solid performance across its key segments and a focus on future technologies like AI, Qualcomm is not only meeting but setting expectations in an industry that’s at the heart of technological evolution. As the company continues to expand its footprint in new and existing markets, it remains a significant player to watch in the tech sector, promising further innovations and growth.

]]>
1043
Collision at Seattle-Tacoma International Airport: Japan Airlines Jet Strikes Parked Delta Aircraft https://hadamard.com/c/collision-at-seattle-tacoma-international-airport-japan-airlines-jet-strikes-parked-delta-aircraft/ Wed, 05 Feb 2025 20:57:37 +0000 https://hadamard.com/c/?p=1039 Continue reading Collision at Seattle-Tacoma International Airport: Japan Airlines Jet Strikes Parked Delta Aircraft]]> Seattle, Washington – In a startling incident at Seattle-Tacoma International Airport (SEA), a Japan Airlines jet collided with a stationary Delta Air Lines aircraft on February 5, 2025, at approximately 10:40 AM local time. This accident, which occurred on a taxi lane not under air traffic control, has sparked a flurry of reactions and investigations.

Step-by-Step Analysis of the Incident:

  1. Incident Overview:
    • The collision involved Japan Airlines Flight 68, which was taxiing, and Delta Air Lines Flight 1921, which was parked. The right wing of the Japan Airlines aircraft struck the tail of the Delta jet.
  2. Location and Time:
    • The event took place on a taxi lane at SEA, which is significant because this area is not typically monitored by air traffic controllers, potentially reducing the immediate oversight that could prevent such incidents.
  3. Immediate Response:
    • According to airport officials and reports from the scene, emergency crews responded swiftly to the collision. The passengers on the Japan Airlines flight were safely evacuated within minutes, showcasing the efficiency of the emergency protocols in place. No injuries were reported, which is a relief given the potential for harm in such situations.
  4. Aircraft Status:
    • Delta Air Lines confirmed that their aircraft was unoccupied at the time of the incident, which likely minimized potential injuries and damage complexities. This detail is crucial as it affects the response strategies and the severity of the incident.
  5. Investigation:
    • Authorities are currently investigating the cause of the collision. Given the location of the incident on a taxi lane, the investigation might focus on ground operations, communication between ground crew, and the operational procedures of both airlines involved. The Federal Aviation Administration (FAA) and possibly the National Transportation Safety Board (NTSB) will likely provide updates as the investigation progresses.
  6. Public and Media Reaction:
    • Social media reactions to the incident were immediate, with users expressing shock, relief at the lack of injuries, and some making light-hearted comments. The post by R A W S A L E R T S on X (formerly Twitter) at 20:21 UTC on the same day highlighted the incident, drawing attention from various users who engaged in discussions ranging from serious concerns to humorous remarks.
  7. Operational Impact:
    • SEA reported that the collision occurred on a taxi lane, which suggests minimal disruption to airport operations. However, the incident still required coordination between SEA, Japan Airlines, and Delta Air Lines to manage the aftermath, including deplaning passengers and moving the affected aircraft.
  8. Safety Considerations:
    • This incident underscores the importance of safety protocols on the ground as well as in the air. The effective evacuation of all passengers without injury speaks to the training and preparedness of the flight crew and emergency services. It also brings into focus the need for continuous review and improvement of taxiing procedures to prevent similar occurrences.

Conclusion:

The collision between a Japan Airlines jet and a parked Delta jet at Seattle-Tacoma International Airport serves as a reminder of the unpredictable nature of aviation operations, even on the ground. While the incident fortunately resulted in no injuries, it has initiated a thorough investigation to understand and rectify the causes. As the aviation community and passengers await further findings, this event highlights the critical role of emergency preparedness and the resilience of airport safety measures in ensuring passenger safety.

This incident, while unfortunate, provides a learning opportunity for enhancing ground operations safety, ensuring that such events are minimized in the future. The aviation authorities, along with the involved airlines, will continue to work towards understanding the incident’s root causes, promising to implement any necessary changes to prevent future occurrences.

]]>
1039
U.S. Postal Service Temporarily Halts Acceptance of Packages from China https://hadamard.com/c/u-s-postal-service-temporarily-halts-acceptance-of-packages-from-china/ Wed, 05 Feb 2025 17:56:49 +0000 https://hadamard.com/c/?p=1036 Continue reading U.S. Postal Service Temporarily Halts Acceptance of Packages from China]]> The USPS headquarters at 475 L'Enfant Plaza, Washington, D.C.

In an unexpected move amidst ongoing trade tensions, the United States Postal Service (USPS) has announced a temporary suspension of accepting parcels from China and Hong Kong, effective immediately. This decision comes in the wake of new tariffs imposed by President Donald Trump on imports from China, which include the elimination of the “de minimis” exemption that previously allowed shipments under $800 to enter the U.S. duty-free.

Context and Impact

The suspension, described by USPS as “temporary” and “until further notice,” is not expected to affect the delivery of letters and documents from these countries. However, this halt in package acceptance could have significant repercussions for the e-commerce sector, especially for companies like Temu and Shein, which heavily rely on direct shipments from China to U.S. consumers.

The move follows President Trump’s recent executive order that levies a 10% tariff on Chinese goods, effectively closing a loophole that allowed for the duty-free importation of low-value packages. This change aims to level the playing field for American businesses but has raised concerns about potential disruptions in the supply chain and increased costs for consumers.

Reactions and Implications

Posts on social media platforms, including X (formerly Twitter), have reflected a mix of frustration and concern from consumers expecting deliveries and businesses facing logistical nightmares. Many users expressed dismay at the lack of prior notice, highlighting how this could disrupt personal and commercial plans.

Analysts suggest that while international logistics companies like FedEx have stated they will continue shipping from China, the USPS’s decision might push more businesses towards alternative carriers, potentially increasing shipping costs and delivery times. There’s also speculation about the impact on small businesses and individual buyers who depend on affordable imports from China.

China’s Response and Broader Trade War Context

In retaliation to the U.S. tariffs, China has announced counter-tariffs on U.S. goods, particularly targeting fossil fuels. This back-and-forth tariff imposition underscores the escalating trade war between the two economic giants, which could further complicate international trade dynamics.

Looking Forward

While the USPS has not provided a definitive timeline for resuming the acceptance of Chinese parcels, it is clear that the service is adapting to the new customs regulations. Analysts are watching closely to see if this is a temporary measure or part of a broader strategy to deal with ongoing trade disputes. Consumers and businesses alike are left to navigate the immediate aftermath, with many seeking alternatives or bracing for potential delays.

Conclusion

This suspension by the USPS marks another chapter in the U.S.-China trade saga, affecting not just economic relations but also the everyday lives of consumers and the operational plans of thousands of businesses. As the situation evolves, all eyes will be on both nations to see how this latest development will shape future trade policies and international commerce.

]]>
1036
Uber Q4 2024 Earnings Report and the Autonomous Driving Horizon https://hadamard.com/c/uber-q4-2024-earnings-report-and-the-autonomous-driving-horizon/ Wed, 05 Feb 2025 13:46:58 +0000 https://hadamard.com/c/?p=1030 Continue reading Uber Q4 2024 Earnings Report and the Autonomous Driving Horizon]]> Uber Car

Uber Technologies Inc. (NYSE: UBER) released its fourth-quarter earnings for 2024 today, showcasing robust financial performance amidst a backdrop of evolving technological landscapes, particularly in the realm of autonomous driving. Here’s an in-depth look at Uber’s earnings and its ongoing journey towards integrating autonomous vehicles (AVs) into its business model.

Financial Highlights:

  • Revenue: Uber reported revenue of $11.96 billion, surpassing expectations of $11.77 billion, indicating a 20% year-over-year increase.
  • Earnings Per Share (EPS): An impressive EPS of $3.21 smashed the consensus estimate of $0.48, with a staggering 386% growth compared to the previous year.
  • Gross Bookings: Total gross bookings reached $44.2 billion, up 18% from the last year, highlighting continued growth in both mobility and delivery segments.
  • EBITDA: Adjusted EBITDA came in at $1.84 billion, up 44% year-over-year, reflecting operational efficiency and cost management.
  • Net Income: Net income for the quarter stood at $6.88 billion, a significant improvement from $1.43 billion in the same quarter of the previous year.

Market Reaction

Despite the stellar earnings, Uber’s stock experienced a dip in pre-market trading, primarily due to concerns over the integration of autonomous vehicles (AVs) into its operations. Posts on X highlighted a market still grappling with the implications of AVs on Uber’s traditional business model, showing a cautious investor sentiment in the short term but optimism for long-term gains.

Autonomous Driving and Uber’s Strategy:

Uber’s ambition in autonomous driving has been well-documented, with significant partnerships and investments aimed at positioning the company at the forefront of this technological shift:

  • NVIDIA Collaboration: Uber has intensified its partnership with NVIDIA, utilizing the latter’s AI technology to accelerate its autonomous vehicle development. This collaboration involves using NVIDIA’s Cosmos platform and DGX Cloud to refine AI models for autonomous driving, aiming for safer and more scalable solutions.
  • Industry Partnerships: Uber has partnerships with several AV providers, including Waymo, which are already operational in cities like Atlanta and Austin. These partnerships are part of Uber’s strategy to “feather in” AVs into its fleet, ensuring a smooth transition without immediate disruption to its human driver network.
  • CEO’s Vision: Dara Khosrowshahi, Uber’s CEO, has articulated a vision where AVs run parallel with human drivers for the next decade. He emphasized during recent public appearances that while near-term changes might not be felt, the long-term implications for transportation could be transformative, enhancing safety and efficiency.

Challenges and Opportunities:

  • Regulatory and Safety Concerns: The transition to AVs is not without hurdles, including regulatory frameworks, safety concerns, and public acceptance. The technology, while advancing, needs to navigate complex urban environments and diverse weather conditions before achieving full autonomy.
  • Economic Impact on Drivers: The integration of AVs poses questions about the future of Uber’s human drivers. However, the current narrative from Uber suggests a gradual integration that could see drivers moving into roles managing or supporting autonomous fleets.
  • Market Expansion and Cost Reduction: Successfully integrating AVs could lead to significant cost reductions for Uber, particularly in labor costs, potentially allowing for lower prices or higher margins. It also opens up new markets, especially where human-driven services are less viable due to cost or driver availability.

Conclusion

Uber’s Q4 2024 earnings reflect a company in robust health, but the real narrative is its strategic positioning in the autonomous driving revolution. While immediate market reactions might be tempered by AV concerns, the long-term vision of Uber in this space could redefine urban mobility. As Uber continues to forge partnerships and invest in technology, it’s clear that the company is not just adapting to the future but actively shaping it.

The journey towards full autonomy will be watched closely by investors, regulators, and tech enthusiasts alike, as it promises to alter not just Uber’s business model but the very fabric of transportation worldwide.

]]>
1030
$AMD’s Earnings: A Deep Dive into Technological Drivers https://hadamard.com/c/amds-earnings-a-deep-dive-into-technological-drivers/ Wed, 05 Feb 2025 13:42:07 +0000 https://hadamard.com/c/?p=1026 Continue reading $AMD’s Earnings: A Deep Dive into Technological Drivers]]> Amd

On February 4, 2025, Advanced Micro Devices, Inc. (AMD) reported its latest quarterly earnings, revealing a mixed bag of outcomes in a landscape dominated by the burgeoning demand for AI and high-performance computing solutions. With a revenue of $7.7 billion for Q4 2024, AMD not only achieved a record but also slightly surpassed analyst expectations. However, the company’s shares took a hit in after-hours trading, dropping about 10% due to concerns over its competitive positioning against Nvidia in the AI chip market.

Financial Performance Overview

  • Revenue: AMD’s revenue jumped 24% year-over-year to $7.7 billion, exceeding the consensus estimate of $7.53 billion. This growth was primarily driven by the Data Center segment, which saw a 69% increase in revenue to $3.9 billion, fueled by robust sales of AMD’s EPYC processors and Instinct GPUs.
  • Gross Margin: The company reported a GAAP gross margin of 51% and a non-GAAP gross margin of 54%, indicating strong profitability despite the competitive pressures.
  • Earnings: Net income on a GAAP basis was $482 million with diluted earnings per share at $0.29. On a non-GAAP basis, net income reached a record $1.8 billion, with earnings per share at $1.09.

Technological Drivers Behind the Earnings

1. AI and Data Center Expansion:

  • AMD Instinct and EPYC Processors: The surge in demand for AI-driven data center solutions has been pivotal for AMD. The company’s Instinct GPUs, designed for AI workloads, and the EPYC processors have significantly penetrated the market, capturing a growing share from competitors like Intel. AMD’s focus on custom AI chips, aiming to provide tailored solutions for major tech players, positions it favorably in this space. However, Nvidia’s CUDA software ecosystem gives it a significant edge, making a transition to AMD’s hardware costlier for companies.

2. Client and Gaming Segments:

  • While the gaming segment saw a decline, the client segment, which includes AMD’s Ryzen processors, grew by 29% year-over-year. This growth is attributed to AMD’s competitive performance in the PC market, especially with advancements in AI capabilities integrated into their processors, appealing to both consumer and enterprise markets.

3. Embedded and Adaptive Computing:

  • AMD’s expansion into embedded systems with its adaptive computing solutions, like the AMD Versal series, has also contributed to its revenue. These technologies cater to niche markets like aerospace, defense, and communications, where customization and performance are critical.

4. Strategic Acquisitions and Partnerships:

  • The acquisition of companies like Silo AI and ZT Systems indicates AMD’s strategic push towards accelerating AI development and deployment on their hardware platforms. Moreover, collaborations announced at events like CES 2025 for new processor lines and software optimizations further solidify AMD’s tech ecosystem.

Challenges and Market Reaction

Despite the positive financials, AMD’s stock reaction was negative, primarily due to:

  • Competition with Nvidia: AMD’s data center segment, a proxy for AI revenue, missed the consensus estimate, underscoring Nvidia’s dominance in AI chips, particularly with their proprietary software solutions.
  • Outlook: AMD’s guidance for the next quarter was perceived as cautious, with expected revenue around $7.1 billion, which did not meet some of the more optimistic forecasts.

Future Outlook

Looking forward, AMD’s CEO, Dr. Lisa Su, highlighted opportunities for growth based on their product portfolio and the increasing demand for high-performance computing. The company is set to launch new AI accelerators and continue expanding its data center solutions, which should help in gaining ground against competitors.

In conclusion, while AMD has shown strong financial performance and technological advancements, the market’s reaction underscores the intense competition in AI and data center technologies. AMD’s strategy will need to focus on enhancing its software ecosystem and perhaps offering more competitive pricing or incentives to sway customers from Nvidia’s stronghold. The tech industry’s eyes will remain on AMD as it navigates these challenges in 2025 and beyond.

]]>
1026
Stablecoin Bill to Bolster U.S. Cryptocurrency Landscape https://hadamard.com/c/stablecoin-bill-to-bolster-u-s-cryptocurrency-landscape/ Wed, 05 Feb 2025 07:55:32 +0000 https://hadamard.com/c/?p=1021 Continue reading Stablecoin Bill to Bolster U.S. Cryptocurrency Landscape]]>


On February 4, 2025, David Sacks, acting as the White House’s AI and Crypto Czar under President Donald Trump, laid out a strategic vision for U.S. cryptocurrency policy during a pivotal press conference. Central to his agenda is the advocacy for a stablecoin bill, which he underscored as the primary legislative priority in the burgeoning digital asset space.

The Stablecoin Bill: A Closer Look

Stablecoin Bill


The proposed stablecoin legislation, introduced by Senator Bill Hagerty (R-Tenn.), aims to establish a “clear regulatory framework” for stablecoins in the United States. Stablecoins are cryptocurrencies designed to minimize volatility by being pegged to more stable assets like the U.S. dollar.

Sacks, alongside key congressional figures, has emphasized the necessity of this legislation to ensure that the U.S. remains a leader in the digital economy. According to reports from multiple sources, including NBC affiliates across the U.S. and Forbes, the bill would:

  • Define clear procedures for the issuance of stablecoins, ensuring they are backed by adequate reserves.
  • Regulate issuers based on their size, with larger issuers falling under federal oversight and smaller ones possibly under state regulation.
  • Enhance the role of the U.S. dollar in the global digital economy by promoting U.S.-based stablecoin issuance, which could drive demand for U.S. Treasuries and potentially lower long-term interest rates.

Bipartisan Support and Legislative Ambitions

The stablecoin bill has garnered significant bipartisan support, with leaders from both the House and Senate expressing a commitment to see this legislation through. Senate Banking Committee Chairman Tim Scott (R-SC) has hinted at the possibility of both this bill and a comprehensive crypto markets structure bill passing within President Trump’s first 100 days in office.

This ambition is underscored by the formation of a bicameral working group dedicated to crypto legislation, signaling a concerted effort to integrate digital assets into the U.S. financial system with regulatory clarity and innovation at its core.

Economic Implications


Supporters of the bill, including Sacks, argue that stablecoins could stimulate significant economic benefits. By pegging digital currencies to the dollar, there’s potential for increased demand for U.S. Treasuries, which could have a stabilizing effect on interest rates. Moreover, this move is seen as a strategic counter to the growing popularity of stablecoins abroad, aiming to reinforce the dollar’s dominance in digital finance.

Challenges and Critiques


However, the path to regulatory clarity is not without its challenges. The bill must navigate through complex regulatory landscapes, ensuring it balances innovation with consumer protection. Critics might argue about the potential for increased systemic risk if not managed correctly or the implications for monetary policy if stablecoins become too dominant.

Moreover, while the bill seeks to provide a framework, there’s still debate over the exact nature of oversight — whether it should be predominantly federal or if states should play a significant role, especially for smaller issuers.

Public and Industry Reaction

Posts on X (formerly Twitter) reflect a community eager for regulatory clarity but cautious about the specifics. There’s an acknowledgment of the potential for a “golden age” of cryptocurrency, as noted by Sacks, but also a call for vigilance to ensure that the legislation truly fosters innovation without stifling it.

Conclusion


The stablecoin bill proposed by David Sacks and introduced by Sen. Hagerty represents a critical step towards integrating cryptocurrencies into the mainstream financial system. With bipartisan support and a clear legislative roadmap, the U.S. is poised to redefine its approach to digital assets, balancing innovation with regulation. As this bill moves through Congress, its implications for both the crypto industry and the broader economy will be closely watched by stakeholders worldwide.

This legislative endeavor not only reflects a significant shift in U.S. policy towards digital currencies but also signals a broader acceptance and integration of blockchain technology into everyday finance.

]]>
1021
Nvidia Unleashes Project Digits: A Personal AI Supercomputer at CES 2025 https://hadamard.com/c/nvidia-unleashes-project-digits-a-personal-ai-supercomputer-at-ces-2025/ Tue, 04 Feb 2025 10:28:35 +0000 https://hadamard.com/c/?p=1010 Continue reading Nvidia Unleashes Project Digits: A Personal AI Supercomputer at CES 2025]]> Las Vegas, January 2025 – At CES 2025, Nvidia, the world leader in accelerated computing, has made a significant announcement that could redefine personal computing and AI development. Introducing Project Digits, Nvidia has unveiled what they’re calling a “personal AI supercomputer,” aimed at bringing the power of AI to developers, researchers, and students alike.

Project Digits

What is Project Digits?

Project Digits is not just another piece of hardware; it’s a compact desktop system powered by Nvidia’s new GB10 Grace Blackwell Superchip. This superchip pairs a Blackwell GPU with a 20-core Grace CPU, offering up to one petaflop of AI performance. The device is designed to run large language models with up to 200 billion parameters, making it a powerhouse for AI model prototyping, fine-tuning, and deployment.

Technical Specifications and Design

  • Processor: GB10 Grace Blackwell Superchip
  • Memory: 128GB of unified, coherent memory
  • Storage: Up to 4TB of NVMe storage
  • Operating System: Nvidia’s Linux-based DGX OS
  • Connectivity: Advanced networking options including Nvidia ConnectX, allowing for scalability by linking two units to handle models with up to 405 billion parameters.

The design philosophy behind Project Digits focuses on accessibility and power efficiency. It’s small enough to resemble a Mac Mini or a Mac Studio, yet it is designed to run from a standard electrical outlet, making it both practical and revolutionary for personal use.
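
To put the 200-billion-parameter claim and the 128GB memory figure in perspective, here is a rough sizing sketch. The bytes-per-parameter values (4-bit, 8-bit, and FP16 weights) are illustrative assumptions, not published Project Digits specifications, and the estimate ignores activations, KV cache, and runtime overhead.

```python
# Rough memory estimate for fitting large models into 128 GB of unified
# memory. The 0.5, 1.0, and 2.0 bytes/parameter figures are assumptions for
# illustration, not published Project Digits specs, and the calculation
# ignores activation memory, KV cache, and runtime overhead.
UNIFIED_MEMORY_GB = 128

def weights_size_gb(params_billions: float, bytes_per_param: float) -> float:
    # Decimal gigabytes of raw weight storage.
    return params_billions * bytes_per_param

for params in (70, 200, 405):
    for bytes_pp, label in ((0.5, "4-bit"), (1.0, "8-bit"), (2.0, "FP16")):
        size = weights_size_gb(params, bytes_pp)
        fits = "fits" if size <= UNIFIED_MEMORY_GB else "exceeds 128 GB"
        print(f"{params}B @ {label}: ~{size:.0f} GB of weights ({fits})")
```

Under these assumptions, a 200-billion-parameter model fits in 128GB only at roughly 4-bit precision, and a 405-billion-parameter model would need the two-unit configuration mentioned above.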

Market Impact and Accessibility

Priced at a starting point of $3,000, Project Digits aims to democratize AI development, bringing what was once the realm of large corporations or data centers into the hands of individuals and smaller organizations. This move is seen as Nvidia’s commitment to expanding the AI developer ecosystem, providing tools that were previously out of reach for many due to cost and complexity.

Software Ecosystem

Project Digits comes preconfigured with Nvidia’s AI software stack, including development kits, orchestration tools, and pre-trained models from the Nvidia NGC catalog. This setup supports popular AI frameworks like PyTorch and Python, offering an immediate, comprehensive environment for AI exploration and development.

Industry Response

The response from the tech community has been overwhelmingly positive. Posts on X (formerly Twitter) from tech influencers and analysts have highlighted the potential of Project Digits to significantly lower the barriers to entry for AI innovation. The sentiment is that this could lead to a surge in AI application development, from educational settings to small startups, fostering a new wave of creativity in AI solutions.

Future Implications

The introduction of Project Digits at CES 2025 marks a pivotal moment in the evolution of personal computing. It’s not just about having more processing power; it’s about making AI development accessible, thereby potentially accelerating the pace at which AI technologies are integrated into everyday life and industry.

Conclusion

Nvidia’s Project Digits could well be the harbinger of a new era in personal computing where AI development becomes as commonplace as web browsing. As Project Digits rolls out in May 2025, the tech world watches with bated breath to see how this innovation will reshape the landscape of AI research and application development.

With this launch, Nvidia not only reaffirms its position at the forefront of AI and computing technology but also sets a new benchmark for what personal computing can achieve in the age of AI.

]]>
1010
OpenAI’s ChatGPT o3 and o3 Mini https://hadamard.com/c/openais-chatgpt-o3-and-o3-mini/ Mon, 03 Feb 2025 16:18:00 +0000 https://hadamard.com/c/?p=996 Continue reading OpenAI’s ChatGPT o3 and o3 Mini]]> In the rapidly evolving landscape of artificial intelligence, OpenAI has once again pushed the boundaries with the introduction of its latest reasoning models: ChatGPT o3 and o3 Mini. These new models, characterized by their enhanced reasoning capabilities, promise to revolutionize how we interact with AI, offering a blend of speed, efficiency, and intelligence that could redefine AI applications across various sectors.

What is ChatGPT o3?

ChatGPT o3 represents the pinnacle of OpenAI’s efforts in developing advanced reasoning capabilities within AI. Announced as part of OpenAI’s “12 Days of OpenAI” event, o3 is designed to excel in complex problem-solving, especially in areas like coding, mathematics, and general intelligence. It has demonstrated remarkable performance on benchmarks such as ARC-AGI, where it not only competed with but surpassed previous models in understanding and applying logic to new problems.

Key Features of o3:

  • Enhanced Problem-Solving: o3 breaks down complex issues into manageable components, reducing AI hallucinations and enhancing accuracy.
  • Logical Reasoning: It showcases an ability to reason through problems in a step-by-step manner, akin to human thought processes.
  • Safety and Public Testing: Before its full release, o3 was subjected to rigorous safety testing by external researchers, reflecting OpenAI’s commitment to ethical AI development.

What is o3 Mini?

While o3 is the flagship model, o3 Mini is its more accessible counterpart, tailored for scenarios where computational resources are limited or when speed and efficiency are paramount. Launched simultaneously for both API and ChatGPT users, o3 Mini brings advanced reasoning to a broader audience without the hefty resource demands of its larger sibling.

Key Features of o3 Mini:

  • Cost-Effectiveness: Designed for lower computational demands, making it ideal for businesses and developers with limited resources.
  • Faster Processing: Offers significantly reduced response times, suitable for real-time applications and edge computing.
  • Adjustable Reasoning Effort: Users can choose from low, medium, or high reasoning effort levels, optimizing performance based on task complexity.
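
As a minimal sketch of how the adjustable effort level is exposed to developers, the snippet below uses the OpenAI Python SDK with the reasoning_effort parameter described in launch-time documentation. Model names and parameters may change, so treat this as illustrative and check the current API reference; it also assumes an OPENAI_API_KEY is configured in the environment.

```python
# Minimal sketch of calling o3-mini with an adjustable reasoning effort via
# the OpenAI Python SDK. The model name and `reasoning_effort` parameter are
# taken from launch-time documentation and may change; verify against the
# current API reference before relying on this. Requires OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="o3-mini",
    reasoning_effort="high",  # "low", "medium", or "high"
    messages=[
        {"role": "user", "content": "Prove that the sum of two even integers is even."}
    ],
)
print(response.choices[0].message.content)
```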

Availability and Accessibility

OpenAI has made a strategic move by offering o3 Mini to free-tier ChatGPT users, democratizing access to cutting-edge AI capabilities. This decision not only broadens the user base but also allows more people to experience the advancements in AI reasoning firsthand.

  • For Free Users: Access o3 Mini by selecting the “Reason” button in ChatGPT’s message composer.
  • For Paid Subscribers: Plus users receive “o3-mini-high,” which provides even more sophisticated responses for complex tasks.

Impact and Use Cases

The implications of these models are vast:

  • Education: Enhanced capabilities in mathematics and science can aid in tutoring and complex problem-solving.
  • Programming: With its coding prowess, o3 Mini can assist developers in debugging, code writing, and understanding complex algorithms.
  • General Knowledge: The ability to integrate with search means users can get up-to-date, reasoned answers with citations from the web.

Conclusion

The arrival of ChatGPT o3 and o3 Mini signifies a major leap forward in AI development, focusing not just on raw computational power but on nuanced, human-like reasoning. As these models begin to see wider adoption, they could fundamentally alter how we interact with technology, making AI a more integral part of daily problem-solving and decision-making. OpenAI’s approach to releasing these models, prioritizing safety and accessibility, sets a precedent for future AI developments, emphasizing the balance between innovation and responsibility.

]]>
996
United Airlines Flight 1382 Evacuated After Engine Fire During Takeoff in Houston https://hadamard.com/c/united-airlines-flight-1382-evacuated-after-engine-fire-during-takeoff-in-houston/ Sun, 02 Feb 2025 18:14:07 +0000 https://hadamard.com/c/?p=1005 Continue reading United Airlines Flight 1382 Evacuated After Engine Fire During Takeoff in Houston]]>

On Sunday morning, a dramatic incident unfolded at George Bush Intercontinental Airport in Houston when United Airlines Flight 1382, en route to New York, had to abort its takeoff due to a reported engine issue. The incident occurred around 8:35 a.m., leading to the evacuation of all passengers from the aircraft.

According to the Federal Aviation Administration (FAA), the crew of Flight 1382 made the critical decision to halt the takeoff roll after detecting the issue with the engine. Video evidence captured by passengers showed flames emanating from the wing of the plane, confirming the severity of the situation.

No injuries were reported among the 104 passengers and five crew members on board, which is a testament to the swift and effective response by the flight crew and emergency services. Passengers were evacuated using both stairs and emergency slides, showcasing the training and readiness of the airline staff in handling such crises.

The Houston Fire Department was on the scene to assist with the deboarding process, although they did not have to put out a fire since the flames had subsided by the time they arrived. This incident led to passengers being bussed back to the terminal, where they were later rebooked on a rescheduled flight to LaGuardia Airport in New York.

United Airlines responded promptly, ensuring that a replacement aircraft was available to transport passengers to their destination later that afternoon. The FAA has since announced an investigation into the cause of the engine malfunction, which is a standard procedure to analyze such incidents for safety enhancements and to prevent future occurrences.

The event has sparked discussions on social media and among aviation enthusiasts regarding the safety protocols and maintenance checks of aircraft engines, highlighting the importance of rigorous safety standards in the airline industry.

This incident, while alarming, underscores the effectiveness of emergency procedures and the preparedness of airline staff, potentially preventing what could have been a far more severe situation.

]]>
1005
How Trump’s Tariffs on Mexico and Canada Impact Technology Prices https://hadamard.com/c/how-trumps-tariffs-on-mexico-and-canada-impact-technology-prices/ Sun, 02 Feb 2025 09:15:20 +0000 https://hadamard.com/c/?p=1000 Continue reading How Trump’s Tariffs on Mexico and Canada Impact Technology Prices]]> Trump Technology Tariffs

In a bold move that harkens back to the economic policies of his first term, President Donald Trump has reintroduced tariffs on imports from Mexico and Canada, aiming to address issues related to trade deficits, immigration, and drug trafficking. These tariffs, set at 25% for products from both neighboring countries, have significant implications for the technology sector, particularly affecting the prices of essential gadgets like smartphones, laptops, and AI devices. Here’s how these tariffs could reshape the tech landscape:

Smartphones and Electronics

Smartphones and other consumer electronics are notably vulnerable to these tariffs. Both Mexico and Canada play crucial roles in the supply chain for tech giants like Apple and Samsung. Mexico, in particular, has become a hub for assembling smartphones due to its proximity to the U.S. and the efficiency of its manufacturing sector.

  • Price Increases: Analysts predict that a 25% tariff could lead to a significant price hike for smartphones. The Consumer Technology Association (CTA) estimates that such tariffs could increase the average price of a smartphone by $213, potentially making even basic models less affordable for the average consumer.
  • Supply Chain Disruption: The tariffs disrupt established supply chains. Companies are forced to either absorb the costs, which squeezes their profit margins, or pass them on to consumers, as most businesses choose to do. This could also motivate companies to shift production elsewhere, though that is a costly and time-consuming process.
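
As a rough illustration of how a flat 25% duty maps onto the CTA-style price estimate, the sketch below assumes full pass-through to consumers. The “implied import value” it derives is a simplification for illustration, not a figure from the CTA study, which models the market in far more detail.

```python
# Rough pass-through arithmetic: what dutiable import value would a 25% duty
# need to act on to produce a $213 retail increase under full pass-through?
# The implied value is a derived illustration, not a figure from the CTA study.
TARIFF_RATE = 0.25
PRICE_INCREASE = 213.0  # CTA's estimated average smartphone price increase

implied_import_value = PRICE_INCREASE / TARIFF_RATE
print(f"Implied dutiable value per phone: ${implied_import_value:,.0f}")

for import_value in (400, 600, 850):
    duty = import_value * TARIFF_RATE
    print(f"${import_value} import value -> ${duty:.0f} duty at full pass-through")
```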

Laptops and Tablets

Laptops and tablets are also expected to see price surges:

  • Market Impact: With Mexico being a significant player in manufacturing, tariffs could inflate laptop prices by as much as 45%, according to some reports. This could lead to a decrease in unit sales as consumers might opt for cheaper alternatives or delay purchases, dampening market growth.
  • Component Costs: Key components like displays and semiconductors might become more expensive, not just for final products but also for replacement parts, affecting repair costs and overall consumer spending on tech maintenance.

AI and Computing Devices

The burgeoning field of artificial intelligence and high-performance computing also faces challenges:

  • AI Hardware: Nvidia’s decision to produce AI server racks in Mexico could be at risk with these tariffs. Higher costs might be passed on to developers and companies using AI technology, potentially slowing down innovation in this sector.
  • Economic Sensitivity: While high-value products like GPUs might withstand some price increases due to their essential nature in data centers, the numerous secondary parts involved in AI infrastructure could see significant cost increases, making large-scale AI projects more expensive.

Conclusion

President Trump’s tariffs on Mexico and Canada are set to raise technology prices, with ripple effects across the industry from manufacturers to end-users. While the aim might be to encourage domestic production, the immediate effect is likely to be higher costs for consumers and businesses alike. The tech sector, known for its global supply chains, now faces the challenge of adapting to these new economic pressures, potentially reshaping how technology is produced, priced, and consumed in the United States. As this policy unfolds, the tech industry will need to navigate these changes with agility to maintain growth and innovation in an increasingly expensive environment.

]]>
1000
January 2025: A Month of Breakthroughs in Physics and Technology https://hadamard.com/c/january-2025-a-month-of-breakthroughs-in-physics-and-technology/ Sat, 01 Feb 2025 09:13:00 +0000 https://hadamard.com/c/?p=975 Continue reading January 2025: A Month of Breakthroughs in Physics and Technology]]>

As we step into 2025, January has already been marked by significant developments in the realms of physics and technology, showcasing both the pioneering spirit and the rapid pace at which innovation is unfolding. Here’s a comprehensive look at some of the key events and advancements:

Quantum Physics and Particle Discoveries

Shape of Electrons: A groundbreaking discovery in quantum physics was announced, where for the first time, the shape of electrons was revealed through advanced experimentation. This could lead to new understandings in atomic and molecular physics, potentially revolutionizing quantum computing and material science. This news was echoed across various platforms, highlighting its significance.

Paraparticles: Physicists theorized the existence of ‘paraparticles,’ a new type of particle that challenges the conventional classifications of fermions and bosons. This theoretical leap could redefine particle physics and has implications for quantum mechanics and technology development. The discussion around these particles has sparked interest in how they might be harnessed for future tech applications.

Technological Milestones

AI Developments: The tech world saw a surge in AI advancements this month. DeepSeek, a Chinese AI app, shocked the global tech industry with its capabilities, leading to a reevaluation of AI investments and strategies. Its rise to the top of the Apple App Store charts was a testament to its impact. Additionally, the landscape was further shaped by the introduction of new AI tools like OpenAI’s Operator and Perplexity Assistant, indicating a broadening of AI’s practical applications.

Supersonic Flight: Boom Supersonic conducted a successful test flight, marking a significant step towards bringing back faster-than-sound commercial travel. This test flight not only showcased technological prowess but also reignited discussions on the future of aviation and environmental considerations in flight technology.

Space and Exploration: January witnessed a historic moment with the first-ever sample return from the Moon’s far side, opening new chapters in lunar exploration and the study of our celestial neighbor. This achievement was part of a broader trend of space exploration innovations, including a Starship test flight that featured the second successful catch of the Super Heavy booster.

Energy and Sustainability

Nuclear Fusion: The field of energy saw a remarkable achievement as plasma at 100 million degrees Celsius was sustained for more than 1,066 seconds, a milestone in the quest for viable nuclear fusion. This progress is pivotal for future sustainable energy solutions, which aim to provide clean, effectively limitless energy.

Battery Technology: CATL, a leading battery manufacturer, announced the mass production of batteries that offer a 1,000 km range, signaling a leap in electric vehicle technology and supporting the push towards reducing carbon footprints in transportation.

Health and Biotechnology

CRISPR Drug Approval: The world’s first CRISPR-based drug was approved, showcasing the potential of gene-editing technologies in treating genetic disorders, thus ushering in a new era for personalized medicine.

Military and Defense Tech

Next-Generation Fighters: The unveiling of a sixth-generation fighter jet prototype indicated advancements in military aviation, focusing on stealth, speed, and integrated AI systems for combat effectiveness.

Conclusion

January 2025 has been a month of profound achievements across various sectors of physics and technology. From the quantum realm to the vastness of space, from AI to sustainable energy solutions, the innovations of this month are setting the stage for what could be a transformative year in science and technology. These developments not only push the boundaries of human knowledge but also promise significant real-world applications, potentially reshaping our daily lives, economies, and the global environmental landscape.

Three Deadly Plane Crashes Mark a Tragic Start to 2025: Analyzing the Recent Surge in Aviation Accidents https://hadamard.com/c/three-deadly-plane-crashes-mark-a-tragic-start-to-2025-analyzing-the-recent-surge-in-aviation-accidents/ Sat, 01 Feb 2025 07:45:37 +0000 https://hadamard.com/c/?p=990 Continue reading Three Deadly Plane Crashes Mark a Tragic Start to 2025: Analyzing the Recent Surge in Aviation Accidents]]> The year 2025 has begun with a series of harrowing events in the aviation sector, highlighted by three significant plane crashes within a short period. These incidents have raised urgent questions about the safety of air travel, prompting a closer examination of the causes behind this alarming trend.

The Tragic Incidents:

1. Jeju Air Crash in South Korea: On December 29, 2024, Jeju Air Flight 7C2216 crashed upon landing at Muan International Airport in South Korea, claiming 179 lives. Initial reports pointed towards a bird strike, but experts have raised concerns about underlying maintenance or operational issues. The plane failed to deploy its flaps or reverse thrust, contributing to the catastrophic outcome.

2. American Airlines Mid-Air Collision in Washington, D.C.: On January 29, 2025, an American Airlines regional jet collided mid-air with a U.S. Army Black Hawk helicopter near Reagan Washington National Airport, resulting in the loss of all 67 individuals on board both aircraft. This incident underscores ongoing challenges in air traffic management and pilot training.

3. Medical Transport Jet Crash in Philadelphia, Pennsylvania: Just two days after the Washington, D.C. tragedy, on January 31, 2025, a medical transport jet operated by Jet Rescue Air Ambulance crashed shortly after takeoff from Northeast Philadelphia Airport. The Learjet 55, carrying a child patient, her mother, and four crew members, exploded in a fireball, setting homes and vehicles ablaze, with no survivors reported. This crash adds to the concern about the safety of smaller, specialized flights, which often operate under less stringent regulatory oversight compared to commercial airlines.

Understanding the Recent Surge in Plane Crashes:

Human Error and Training: Human error continues to be a significant factor in aviation accidents. Both the Washington, D.C. and Pennsylvania incidents suggest potential issues with air traffic control, pilot decision-making, or crew response to emergencies.

Maintenance and Mechanical Failures: The Jeju Air and Philadelphia crashes point to the critical role of maintenance. Even with regular checks, mechanical failures or errors in maintenance procedures can lead to disasters. The Philadelphia crash, in particular, involved a jet that had recently undergone maintenance, highlighting the need for rigorous oversight.

Environmental and External Factors: While bird strikes were mentioned in the Jeju Air incident, the broader context includes weather conditions, bird activity around airports, and even geopolitical tensions as seen in previous incidents like the Azerbaijan Airlines crash near Aktau, Kazakhstan.

Systemic Safety Measures: These crashes might indicate systemic issues within the aviation industry, where the growth in air travel, especially in specialized sectors like medical transport or private charters, might not be matched by safety protocols. The frequency of these incidents calls for a reevaluation of safety standards, especially given the diverse nature of aviation operations.

Looking Forward:

Despite these recent tragedies, air travel’s safety record remains strong when viewed over time. However, these events serve as critical reminders of the need for continuous improvement in safety measures.

Ongoing investigations into these crashes will hopefully provide detailed insights into what went wrong, leading to industry-wide changes. Enhanced pilot training, better aircraft maintenance practices, improved air traffic control, and possibly more stringent regulations for specialized flight operations could be part of the response to ensure such tragedies do not recur.

In conclusion, these three crashes at the start of 2025 urge a thorough reassessment of aviation safety from all angles. The aviation sector must leverage these unfortunate events to push for safety enhancements, ensuring the skies remain as safe as possible for all travelers.

AI in Government and Enterprise: Cutting-Edge Developments Shake Up the Tech Landscape https://hadamard.com/c/ai-in-government-and-enterprise-cutting-edge-developments-shake-up-the-tech-landscape/ Fri, 31 Jan 2025 08:09:00 +0000 https://hadamard.com/c/?p=964 Continue reading AI in Government and Enterprise: Cutting-Edge Developments Shake Up the Tech Landscape]]> In a world where technological innovation is constantly redefining the boundaries of what’s possible, recent strides in AI for government and enterprise applications have marked a new era of efficiency, security, and capability. This week, two significant developments in the realm of artificial intelligence have captured the attention of tech enthusiasts, policymakers, and business leaders alike: OpenAI’s launch of ChatGPT Gov and Alibaba’s unveiling of the Qwen2.5-Max AI model. Concurrently, Helion’s substantial funding for fusion reactor development underscores the broader trend of high-stakes investment in cutting-edge technologies.

OpenAI’s ChatGPT Gov: A Leap Forward for Public Sector AI

OpenAI has made a significant move into the public sector with the introduction of ChatGPT Gov, a specialized version of its renowned AI chatbot tailored for U.S. government use. This platform is designed to handle non-public, sensitive information within secure hosting environments, aligning with the stringent security requirements of government agencies.

ChatGPT Gov promises to revolutionize how government operations are conducted, from drafting policy memos to translating documents and summarizing complex data. This move not only enhances the capabilities of government employees but also aims to streamline processes, allowing staff to focus on higher-value tasks. The platform’s deployment on Microsoft’s Azure Government cloud underscores a commitment to privacy and compliance, crucial in an era where data security is paramount.

The launch of ChatGPT Gov comes at a time when AI’s role in government is under scrutiny for potential privacy risks and ethical concerns. However, OpenAI’s initiative is seen as a step towards responsible AI deployment, emphasizing security, privacy, and compliance with federal standards such as FedRAMP; accreditation for handling sensitive data is still in progress.

Alibaba’s Qwen2.5-Max: A Game-Changer in AI Performance

On the other side of the globe, Alibaba has thrown its hat into the ring with the Qwen2.5-Max, an AI model claiming to outperform not only its Chinese competitors like DeepSeek but also international giants like OpenAI’s GPT-4o and Meta’s Llama. This model represents a significant leap in AI benchmark performance, excelling in areas such as problem-solving, coding, and mathematical reasoning.

Alibaba’s bold claim of superiority has stirred the AI market, particularly due to the model’s efficiency in terms of infrastructure cost reduction. By potentially cutting AI deployment costs by up to 60%, Qwen2.5-Max could democratize advanced AI capabilities for a broader range of enterprises, reshaping the competitive landscape. This development is part of a broader push by Chinese tech companies to challenge Western dominance in AI technology, showcasing innovation despite restrictions on high-end chip access.

Helion and the Future of Energy with AI

In a related but distinct vein, Helion Energy has raised $425 million for the development of fusion reactors. While not AI-centric, this investment highlights the growing intersection between the AI and energy sectors. Fusion energy, long seen as the holy grail of clean energy, could benefit immensely from AI in optimizing reactions, predicting outcomes, and managing complex control systems.

This investment signals a robust belief in the potential of AI to tackle some of the world’s most pressing issues, like sustainable energy. With AI’s ability to process vast amounts of data quickly and accurately, the synergy between AI and fusion research could accelerate the timeline to practical fusion energy, impacting both government policy and enterprise strategies in energy production.

Conclusion

The concurrent developments from OpenAI, Alibaba, and Helios illustrate a vibrant period for AI in both government and enterprise sectors. These advancements not only promise more efficient, secure, and innovative applications of AI but also highlight the global race to leverage AI for societal and economic benefits. As these technologies mature, the implications for public policy, business operations, and environmental sustainability will be profound, marking a new chapter in the story of technological evolution.

Microsoft, Tesla, and Meta Report Quarterly Earnings: A Deep Dive into Tech Giants’ Performance https://hadamard.com/c/microsoft-tesla-and-meta-report-quarterly-earnings-a-deep-dive-into-tech-giants-performance/ Wed, 29 Jan 2025 21:46:00 +0000 https://hadamard.com/c/?p=979 Continue reading Microsoft, Tesla, and Meta Report Quarterly Earnings: A Deep Dive into Tech Giants’ Performance]]>

On January 29, 2025, three of the tech industry’s behemoths—Microsoft, Tesla, and Meta Platforms—revealed their quarterly earnings, providing insights into their financial health and strategic directions amid a rapidly evolving tech landscape.

Microsoft: Steady Growth with AI at the Forefront

Microsoft reported a robust quarter, with revenues hitting $69.6 billion, surpassing analyst expectations of $68.9 billion. This performance was buoyed by the company’s focus on Artificial Intelligence (AI), particularly through its partnership with OpenAI. The company’s earnings per share (EPS) came in at $3.23, slightly above the consensus estimate of $3.11. Microsoft’s Azure cloud services continued to show steady growth, although the pace has been scrutinized for not accelerating as much as its competitors like Alphabet and Amazon.

The AI sector has been a significant driver for Microsoft, with investments in infrastructure and the integration of AI across its product suite. However, there’s an ongoing debate about the sustainability of such heavy investments in AI without immediate tangible returns, which was reflected in the mixed analyst sentiment regarding future growth prospects.

Tesla: Navigating Challenges in the Electric Vehicle Market

Tesla’s quarterly results were less celebratory. The company posted revenues of $25.7 billion, below analyst expectations, and the EPS was slightly below forecasts at $0.73 compared to an anticipated $0.77. This quarter marked a period of adjustment for Tesla, which saw its first year-over-year decline in vehicle deliveries, influenced by competitive pressure from Chinese EV manufacturers and policy changes under the current U.S. administration affecting EV incentives.

Tesla’s focus on autonomous driving technology and potential future revenue streams from robotaxis and full self-driving software remains a point of interest and concern. The company’s proximity to the current government might ease regulatory hurdles, but the market’s confidence in Tesla’s short-term profitability has waned, leading to a 5% drop in stock value over the last month.

Meta Platforms: AI and Advertising Power Growth

Meta Platforms, formerly known as Facebook, reported revenues of $48.4 billion, with an EPS of $8.02, both slightly above what analysts had anticipated. The growth was primarily driven by the company’s advertising business, which has been enhanced by AI technologies, improving ad relevance and user engagement.

However, Meta’s significant capital expenditure, with plans to invest between $60 to $65 billion in AI in 2025, has raised eyebrows. This spending is aimed at bolstering AI infrastructure but also reflects the competitive pressure to innovate in AI, especially after the recent market disruption by Chinese AI startup DeepSeek. Despite these investments, Meta’s stock has shown resilience, buoyed by positive sentiments around its AI strategy and future monetization opportunities.

Market Reactions and Forward Outlook

The market reacted variably to these earnings reports. Microsoft’s stock saw positive movement post-announcement, reflecting confidence in its AI roadmap and cloud services. Tesla, however, faced a more skeptical market, with investors looking for clearer guidance on profitability and production. Meta’s shares also experienced a lift, driven by optimism about its AI initiatives.

In summary, while Microsoft and Meta continue to leverage AI for growth, Tesla faces the challenge of proving its long-term strategy in a competitive EV market. Each company’s performance today will likely influence investor sentiment, strategic directions, and could set the tone for tech investments in 2025, particularly in AI and sustainable technology sectors.

This earnings season underscores the importance of innovation, strategic investment, and the ability to navigate regulatory and market challenges in maintaining or growing market share in the tech industry.

Quantum Internet: Unlocking the Future of Secure Communication https://hadamard.com/c/quantum-internet-unlocking-the-future-of-secure-communication/ Wed, 29 Jan 2025 19:09:00 +0000 https://hadamard.com/c/?p=960 Continue reading Quantum Internet: Unlocking the Future of Secure Communication]]> In an era where data security and privacy are paramount, the concept of a Quantum Internet stands out as a beacon of hope, promising communication that is theoretically unbreakable. This revolutionary idea leverages one of the most enigmatic phenomena in quantum physics: entanglement.

What is Quantum Entanglement?

At the heart of the Quantum Internet lies quantum entanglement, a phenomenon Albert Einstein famously dubbed “spooky action at a distance.” When two particles become entangled, their measurement outcomes remain correlated no matter how far apart they are: measuring one immediately determines what the corresponding measurement on the other will yield. These correlations cannot be used to send signals faster than light, but they are the cornerstone of secure quantum communication.

How Does a Quantum Internet Work?

  1. Quantum Key Distribution (QKD): The most immediate application of quantum networks is in securing data transmission. In QKD, entangled particles are used to share keys for encryption. Any attempt to intercept these keys would disturb the quantum state, alerting the communicating parties to the presence of an eavesdropper and thus ensuring the security of the communication (a toy simulation of this idea follows this list).
  2. Quantum Repeaters: Traditional fiber optic cables suffer from signal loss over long distances. Quantum repeaters are being developed to extend the range of quantum communication by maintaining or boosting the quantum state of photons over long distances, making a global quantum network feasible.
  3. Quantum Teleportation: Not to be confused with sci-fi teleportation, quantum teleportation involves transferring quantum states from one location to another. This can be used for secure data transfer or for sharing quantum information across the network.
  4. Entanglement Distribution: Creating and distributing entangled particles across vast distances for use in network nodes. This can form a ‘backbone’ for a quantum internet where information can be securely relayed.
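
As a rough illustration of the eavesdropping-detection idea behind QKD, the following Python sketch simulates a simplified BB84-style exchange, with classical random bits standing in for photons. The sizes and variable names are arbitrary, and real systems add error correction and privacy amplification on top of this basic step.

import secrets

# Toy BB84-style key exchange: same-basis measurements are faithful,
# mismatched bases give random outcomes, so an intercept-resend
# eavesdropper leaves a roughly 25% error rate in the sifted key.
def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def measure(bit, prep_basis, meas_basis):
    return bit if prep_basis == meas_basis else secrets.randbelow(2)

n = 2000
alice_bits, alice_bases = random_bits(n), random_bits(n)

eve_present = True
if eve_present:
    # Eve measures in random bases and resends what she observed.
    eve_bases = random_bits(n)
    channel_bits = [measure(b, ab, eb)
                    for b, ab, eb in zip(alice_bits, alice_bases, eve_bases)]
    channel_bases = eve_bases
else:
    channel_bits, channel_bases = alice_bits, alice_bases

bob_bases = random_bits(n)
bob_bits = [measure(b, cb, bb)
            for b, cb, bb in zip(channel_bits, channel_bases, bob_bases)]

# Keep only rounds where Alice and Bob chose the same basis, then compare a sample.
sifted = [(a, b) for a, b, ab, bb
          in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
errors = sum(a != b for a, b in sifted)
print(f"sifted bits: {len(sifted)}, error rate: {errors / len(sifted):.1%}")

Running it with eve_present set to False gives an error rate of zero, while the intercept-resend attack pushes the error rate toward 25%, which is exactly the signature the communicating parties look for.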

Current State and Challenges

  • Research and Experiments: Countries like China, the Netherlands, and the USA are at the forefront of developing quantum networks. China has already launched a satellite, Micius, demonstrating quantum key distribution from space to Earth. In Europe, the Quantum Internet Alliance is working on a blueprint for a pan-European quantum communication infrastructure.
  • Technological Hurdles: Creating stable, long-lived quantum states, scaling up entanglement distribution, and developing quantum memories that can store quantum information are significant challenges. Additionally, the infrastructure required, like quantum repeaters, needs further development for practical, widespread use.
  • Security Considerations: While quantum networks promise unparalleled security, they also raise new questions about cybersecurity. For instance, quantum computers could break current encryption methods, making the transition to quantum-resistant algorithms urgent.

Future Prospects

The vision for a fully operational Quantum Internet includes:

  • Unhackable Communication: From personal data to national security, communications could be secured against all forms of cyber-attacks.
  • Distributed Quantum Computing: Potentially allowing quantum computers in different locations to work together, enhancing computational power for complex problems.
  • Quantum Sensing Networks: For applications like precision timekeeping across continents or ultra-sensitive detection systems for scientific experiments or defense.
  • A New Internet Paradigm: Perhaps most intriguingly, a quantum internet might redefine what the internet can do, from how we share information to how we perceive the fabric of reality itself.

Conclusion

The Quantum Internet is not just an enhancement of today’s internet but a leap into a new paradigm of communication where privacy and security are inherent to the physics of how information is transmitted. While still in its nascent stages, the ongoing research and experiments herald a future where the quantum internet could become as commonplace as today’s digital networks. As we navigate through these technological waters, the promise of a quantum-secure world beckons, offering both challenges and unprecedented opportunities.

China’s Leap Forward in 5G and the Dawn of 6G: High-Speed Space-Ground Laser Transmission https://hadamard.com/c/chinas-leap-forward-in-5g-and-the-dawn-of-6g-high-speed-space-ground-laser-transmission/ Wed, 29 Jan 2025 07:44:00 +0000 https://hadamard.com/c/?p=955 Continue reading China’s Leap Forward in 5G and the Dawn of 6G: High-Speed Space-Ground Laser Transmission]]> In the rapidly evolving landscape of telecommunications, China has not only emerged as a leader in the deployment of 5G but is now setting the stage for the next leap into 6G technology. This advancement is particularly highlighted by breakthroughs in high-speed space-ground laser transmission, signaling a new era in global connectivity.

5G Dominance and Infrastructure Expansion

China’s journey in the realm of 5G began with an ambitious rollout, establishing over 4.5 million 5G base stations by 2025, which accounts for a significant portion of the global total. This extensive infrastructure has not only facilitated widespread 5G availability but has also been instrumental in integrating 5G technology into various sectors like manufacturing, mining, and healthcare. The country’s state-backed initiatives have provided a fertile ground for companies like Huawei, ZTE, and China Mobile to innovate, leading to a robust ecosystem where 5G applications are not just theoretical but actively transforming industries.

The dominance in 5G infrastructure has been coupled with technological innovations such as massive MIMO (Multiple Input Multiple Output) and advanced network slicing, ensuring that China’s 5G networks offer unparalleled speed, low latency, and reliability. This has positioned China as a benchmark for 5G capabilities worldwide, with over 50% of its mobile users connected to 5G services.

Pushing Boundaries with 6G Research

While the world is still grappling with the full potential of 5G, China has already set its sights on 6G. The research into 6G technology in China started concurrently with the deployment of 5G, with the country aiming to define global standards and lead in this new technological frontier. Chinese universities, research institutes, and tech giants are collaborating under government guidance to explore technologies that could dwarf current 5G capabilities in data speed, latency, and connectivity.

One of the most talked-about developments is the use of terahertz frequencies for 6G, which promise data transfer rates far exceeding those of 5G. These frequencies, however, come with challenges like limited range and obstruction by atmospheric conditions, which Chinese researchers are actively addressing.

Breakthrough in Space-Ground Laser Communication

A significant milestone in China’s 6G journey is the breakthrough in space-ground laser communication. This technology allows for data transmission at unprecedented speeds, with reports of achieving 100 Gbps rates, outpacing even the advancements made by competitors like SpaceX’s Starlink. The implications are vast, from enhancing satellite internet connectivity to enabling ultra-fast data links for a variety of applications, including autonomous vehicles, real-time global monitoring systems, and high-definition video streaming from space.
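
For a sense of what a 100 Gbps space-ground link means in practice, the short calculation below converts that line rate into a transfer time for a hypothetical 1 TB data dump. This is illustrative arithmetic only; real links lose capacity to coding overhead, pointing error, and weather-dependent availability.

# Transfer time over a 100 Gbps optical downlink (illustrative only).
link_rate_bps = 100e9        # the reported 100 Gbps demonstration rate
dataset_bits = 8e12          # a hypothetical 1 TB dataset, in bits
seconds = dataset_bits / link_rate_bps
print(f"{seconds:.0f} s to move 1 TB at the full line rate")  # 80 s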

This laser-based system could be integral to 6G networks, where the need for high bandwidth and low latency is paramount. The technology leverages the precision of laser beams to transmit data, potentially revolutionizing how data moves from space to ground and vice versa, offering a backbone for future internet services that are both global and instantaneous.

Global Implications and Future Directions

China’s lead in 5G and pioneering steps into 6G are not just about technological superiority but also about setting the stage for economic and geopolitical influence. The ability to define standards in telecommunications can dictate how global communication infrastructure evolves, affecting everything from consumer devices to industrial automation and defense systems.

However, this push also raises concerns about security, data privacy, and the geopolitical implications of having one nation lead in such critical technology. Countries around the world are watching closely, with some forming alliances to counterbalance China’s influence by developing alternative standards and technologies.

As we look towards the future, the trajectory China is charting with 5G and 6G technologies signals a world where connectivity could be seamless, pervasive, and infinitely more capable than what we know today. The high-speed space-ground laser transmission is just one facet of this broader narrative, where the race for the next generation of wireless technology is as much about speed and efficiency as it is about influence and security in the digital age.

Helion Secures $425 Million to Power Microsoft with Fusion Energy https://hadamard.com/c/helion-secures-425-million-to-power-microsoft-with-fusion-energy/ Tue, 28 Jan 2025 13:08:46 +0000 https://hadamard.com/c/?p=948 Continue reading Helion Secures $425 Million to Power Microsoft with Fusion Energy]]> Helion reactor

In a monumental stride towards harnessing nuclear fusion for commercial use, Helion Energy has recently announced a $425 million Series F funding round aimed at constructing a fusion reactor specifically for Microsoft. This development marks a significant milestone in the quest for sustainable energy, showcasing an ever-growing interest and investment in what has long been considered the “Holy Grail” of clean energy solutions.

The Fusion Promise

Nuclear fusion, the process that powers our sun, involves fusing atomic nuclei together to release vast amounts of energy. Unlike nuclear fission, which splits atoms and produces long-lasting radioactive waste, fusion promises clean, virtually limitless energy with minimal environmental impact. However, replicating this process on Earth in a controlled, energy-efficient manner has been one of the greatest challenges in physics.

Helion’s Unique Approach

Helion Energy, headquartered in Everett, Washington, has been a pioneering force in the fusion sector. Unlike many competitors focusing on tokamaks or laser-based systems, Helion employs a field-reversed configuration (FRC) method. This involves compressing plasma with powerful magnets in a device that looks somewhat like an hourglass, where plasmas collide at high velocities to trigger fusion. Helion’s technology is designed to not only create fusion but also to directly convert the energy produced into electricity, promising higher efficiency.

The Microsoft Deal

In May 2023, Microsoft made headlines by signing a power purchase agreement with Helion, committing to buy electricity from Helion’s first commercial fusion power plant by 2028. This deal is unprecedented, as it’s the first time a major corporation has committed to purchasing energy from a fusion reactor. The agreement underscores Microsoft’s ambitious goal to become carbon negative by 2030, highlighting fusion’s potential role in achieving sustainable energy objectives on a corporate scale.

Investment and Milestones

The recent $425 million funding round is a testament to investor confidence in Helion’s approach. With this capital, Helion aims to accelerate the construction of Polaris, their seventh prototype, which is expected to be the first fusion reactor to generate net electricity. This prototype is crucial, as it’s intended to demonstrate not only the feasibility of Helion’s technology but also its scalability for commercial applications.

The funding will also support the expansion of in-house manufacturing capabilities, particularly for the magnetic coils critical to their reactors, speeding up development cycles. Investors in this round include notable names like Lightspeed Venture Partners and SoftBank Vision Fund 2, alongside a significant university endowment, reflecting a broad spectrum of confidence in fusion’s future.

Challenges and Skepticism

Despite the optimism, the path to commercial fusion energy is fraught with technical and economic challenges. Fusion has historically been described as “30 years away,” with many past predictions about its commercialization proving overly optimistic. Helion’s ambitious timeline of 2028 for delivering power to Microsoft’s grid has sparked both excitement and skepticism within the scientific community. The primary challenges include achieving and sustaining the necessary conditions for fusion, managing the costs of such advanced technology, and proving its economic viability against established energy sources.

Looking Forward

Helion’s progress, backed by significant financial and corporate support, offers a beacon of hope for fusion energy’s practical application. As they continue to push the boundaries of what’s possible, the eyes of the world are on them, not just for the technological breakthrough but for what it could mean for our planet’s energy future. If successful, Helion’s venture could catalyze a new era in energy production, contributing significantly to global efforts to combat climate change.

This investment round and the Microsoft deal are not just about funding a project; they’re about investing in a vision where clean, abundant energy from fusion becomes a reality. As we move closer to 2028, the fusion community, investors, and the world will watch with bated breath, hoping that this time, the promise of fusion will leap from science fiction to fact.

DeepSeek: Advancements in Open-Source Large Language Models https://hadamard.com/c/deepseek-advancements-in-open-source-large-language-models/ Mon, 27 Jan 2025 09:09:19 +0000 https://hadamard.com/c/?p=945 Continue reading DeepSeek: Advancements in Open-Source Large Language Models]]>

Abstract

DeepSeek, an AI company based in Hangzhou, China, has emerged as a pivotal player in the development of open-source large language models (LLMs). This scientific article delves into DeepSeek’s innovative approaches in model architecture, training methodologies, and their impact on the broader AI community. With significant contributions like DeepSeek-V2 and DeepSeek-R1, this firm demonstrates how to achieve high performance under resource constraints, challenging the dominance of Western tech giants in AI research.

Introduction

The landscape of artificial intelligence, particularly in natural language processing, has been dominated by models requiring vast computational resources. DeepSeek has introduced a paradigm shift by leveraging efficiency in both training and inference stages of LLMs. This article summarizes the technological feats of DeepSeek, focusing on its latest models and their implications for the future of AI.

Model Architecture and Innovations

  • DeepSeek-V2: This model utilizes a Mixture-of-Experts (MoE) architecture, comprising 236 billion total parameters but activating only 21 billion per token. It introduces Multi-head Latent Attention (MLA) and DeepSeekMoE, which significantly reduce training costs and inference times. These innovations allow DeepSeek-V2 to perform comparably to larger models while using fewer resources (a minimal sketch of the top-k expert-routing idea follows this list).
  • DeepSeek-R1: An evolution from DeepSeek-R1-Zero, this model focuses on reasoning capabilities, employing reinforcement learning (RL) without traditional supervised fine-tuning (SFT). The approach has led to the development of models that can autonomously generate chain-of-thought (CoT) reasoning, self-verification, and reflective capabilities. DeepSeek-R1’s performance rivals that of OpenAI’s o1, particularly in math, coding, and logical reasoning tasks, made accessible as an open-source model.
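
To illustrate the routing idea behind a Mixture-of-Experts layer, here is a minimal NumPy sketch of top-k gating. The dimensions, expert count, and initialization are arbitrary toy values, and production systems such as DeepSeek’s typically add load-balancing objectives, capacity limits, and parallel expert execution across devices.

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class TinyMoE:
    # Toy Mixture-of-Experts layer: each token is routed to its top-k experts,
    # so only a fraction of the total parameters is active per token.
    def __init__(self, d_model=32, d_hidden=64, n_experts=8, top_k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.top_k = top_k
        self.router = rng.normal(0.0, 0.02, (d_model, n_experts))
        self.w_in = rng.normal(0.0, 0.02, (n_experts, d_model, d_hidden))
        self.w_out = rng.normal(0.0, 0.02, (n_experts, d_hidden, d_model))

    def __call__(self, tokens):                  # tokens: (n_tokens, d_model)
        gates = softmax(tokens @ self.router)    # routing scores per expert
        out = np.zeros_like(tokens)
        for t, (x, g) in enumerate(zip(tokens, gates)):
            top = np.argsort(g)[-self.top_k:]    # indices of the top-k experts
            weights = g[top] / g[top].sum()      # renormalized gate weights
            for w, e in zip(weights, top):
                hidden = np.maximum(x @ self.w_in[e], 0.0)   # expert FFN (ReLU)
                out[t] += w * (hidden @ self.w_out[e])
        return out

moe = TinyMoE()
x = np.random.default_rng(1).normal(size=(4, 32))
print(moe(x).shape)   # (4, 32): each token touched only 2 of the 8 experts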

Training and Efficiency

DeepSeek’s approach to training involves a combination of high-quality, multi-source datasets, and innovative training strategies:

  • Dataset Utilization: Models like DeepSeek-V3 are trained on vast datasets, with DeepSeek-V3 being pre-trained on 14.8 trillion tokens. The efficiency in data usage is crucial for achieving high performance with limited hardware resources.
  • Resource Optimization: By focusing on software optimization and algorithmic improvements, DeepSeek has demonstrated that high-performance AI can be developed even under the constraints of U.S. export controls on advanced semiconductors. The company’s models are noted for their economical training costs and efficient inference, with DeepSeek-V3 trained in approximately 55 days at a cost of US$5.58 million, a stark contrast to the investment levels of competitors (see the quick arithmetic below).
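
Taking the reported figures at face value, a quick bit of arithmetic puts that cost in perspective; the values below are derived purely from the numbers quoted above, with no additional assumptions.

# Derived from the figures quoted above for DeepSeek-V3.
total_cost_usd = 5.58e6   # reported training cost
train_days = 55           # reported wall-clock training time
tokens = 14.8e12          # reported pre-training token count

print(f"~${total_cost_usd / (tokens / 1e6):.2f} per million training tokens")
print(f"~${total_cost_usd / train_days:,.0f} of compute spend per day")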

Impact on AI Research and Development

  • Open-Source Contribution: DeepSeek’s commitment to open-source principles has democratized access to advanced AI technologies. By releasing models with MIT licenses, they’ve fostered an environment where academic and commercial entities can freely explore, modify, and build upon their work. This has implications for global AI innovation, particularly in regions with restricted access to top-tier hardware.
  • Benchmark Performance: DeepSeek’s models have shown competitive, if not superior, performance in various benchmarks, especially in reasoning, coding, and math tasks. DeepSeek-R1, for instance, has been noted for outperforming Llama 3.1 and Qwen 2.5 in certain contexts while matching the capabilities of GPT-4o and Claude 3.5 Sonnet.

Challenges and Future Directions

Despite the successes, DeepSeek faces challenges such as the need for continual improvement in model interpretability and avoiding biases in training data. Future research might explore:

  • Scalability: While DeepSeek has shown how to scale effectively with fewer resources, further scaling under similar constraints could push the boundaries of what’s possible in AI.
  • Multimodal Capabilities: Expanding beyond text to include vision-language models like DeepSeek-VL, which could integrate more complex real-world understanding tasks.
  • Ethical AI: Ensuring models adhere to ethical guidelines, particularly in handling sensitive information and cultural nuances, will be crucial as DeepSeek’s models gain wider adoption.

Conclusion

DeepSeek’s journey in AI research illuminates a path where scientific curiosity and strategic resource use can lead to groundbreaking technological advancements. The company’s models not only compete on performance metrics but also in fostering a collaborative open-source community, which could redefine the global AI development landscape.

The $Grok Memecoin: A 2025 Overview https://hadamard.com/c/the-grok-memecoin-a-2025-overview/ Wed, 15 Jan 2025 20:06:06 +0000 https://hadamard.com/c/?p=928 Continue reading The $Grok Memecoin: A 2025 Overview]]>

In the ever-evolving world of cryptocurrency, the Grok memecoin has carved out a niche for itself that resonates with both the tech-savvy and the meme-loving crowd. As we stand in early 2025, Grok ($GROK) has not only survived the volatile tides of the crypto market but has also woven its narrative deeply into the fabric of digital culture, particularly through its association with none other than Elon Musk.

The Birth and Rise of Grok

Grok was inspired by Musk’s AI chatbot, also named “Grok,” launched by his company xAI. The memecoin, however, operates independently of Musk’s enterprises, riding on the wave of his influential persona and the public fascination with AI. Launched in late 2023, $GROK quickly captured attention due to its humorous and light-hearted approach, echoing the ethos of its AI counterpart. By 2025, it has grown into one of the most talked-about memecoins, with a dedicated community that thrives on meme culture, tech innovation, and speculative trading.

The Community and Its Culture

The Grok community is vibrant, with nearly 25,000 members across various platforms, contributing to a dynamic ecosystem of memes, discussions, and shared visions for the future of AI and blockchain. This community has been instrumental in driving the coin’s popularity, often highlighted by posts on social media where Elon Musk himself has shared Grok-related memes, adding to the coin’s legitimacy and visibility in the meme coin arena.

The community’s engagement goes beyond mere speculation; it’s about fostering a culture where AI meets meme culture, creating a unique space where technology is celebrated through humor and creativity. This cultural blend has been pivotal in maintaining $GROK’s relevance and appeal.

AI Memecoins in 2025

The landscape of AI memecoins has expanded significantly by 2025, with $GROK leading as a flagship example. These tokens combine the speculative investment nature of memecoins with the cutting-edge appeal of artificial intelligence. They serve as a speculative asset but also as a cultural phenomenon, where investors are betting on the future of AI as much as on the meme’s virality.

The importance of AI memecoins like Grok lies in several key areas:

  • Innovation and Awareness: They bring attention to AI technology in an accessible, fun manner, encouraging broader understanding and adoption among the public.
  • Market Dynamics: They test the waters of market sentiment towards AI, providing developers and entrepreneurs with real-time feedback on consumer interest in AI-related technologies.
  • Cultural Impact: Memecoins like $GROK create a bridge between tech communities and the general internet populace, leveraging memes as a universal language for tech engagement.

Looking to the Future: Grok3 and Tesla Integration

As we look ahead, the anticipation around “Grok3” is palpable within the community. While details are speculative, the promise of more advanced AI integration into the memecoin’s ecosystem could redefine user interactions and utility.

Moreover, one of the most intriguing developments announced is the integration of Grok AI into Tesla vehicles. Elon Musk has confirmed plans to embed Grok AI into Tesla’s fleet, potentially allowing for a more personalized driving experience where AI not only navigates but also interacts with passengers in a way that reflects Musk’s vision of AI – humorous, insightful, and maximally helpful. This integration could significantly elevate the utility and appeal of $GROK, intertwining it further with Musk’s broader technological ecosystem.

Why Grok Matters

Who controls the memes, controls the universe

Elon Musk, May 2022

Grok memecoin stands at the intersection of tech innovation, cultural expression, and investment speculation. It’s a testament to how cryptocurrencies can go beyond mere financial instruments to become part of a broader cultural narrative. As we move into the future, the developments around Grok3 and its Tesla integration suggest that memecoins can be more than a fleeting trend; they could be pivotal in shaping how new technologies are adopted and perceived by the masses.

In conclusion, the Grok memecoin in 2025 is not just about the potential financial gains but about participating in a community that celebrates the fusion of AI, memes, and the visionary influence of figures like Elon Musk. It’s a microcosm where technology, culture, and speculation dance in a unique, sometimes unpredictable, but always engaging performance.

Source: https://x.com/Moonsoon4206933

The EPR Experiment: A Quantum Conundrum https://hadamard.com/c/the-epr-experiment-a-quantum-conundrum/ Sat, 11 Jan 2025 17:24:00 +0000 https://hadamard.com/c/?p=924 Continue reading The EPR Experiment: A Quantum Conundrum]]> In the realm of quantum mechanics, few thought experiments have stirred as much debate and intrigue as the Einstein-Podolsky-Rosen (EPR) experiment. Proposed in 1935 by Albert Einstein, Boris Podolsky, and Nathan Rosen, this thought experiment was designed to challenge the completeness of quantum mechanics, leading to discussions that continue to shape our understanding of quantum theory today.

The Essence of the EPR Paradox

Einstein EPR

At the heart of the EPR experiment lies the concept of quantum entanglement, where two particles become so intrinsically linked that the quantum state of one (no matter how far away) can instantaneously affect the state of the other. The EPR paradox was formulated to argue that quantum mechanics might not provide a complete description of physical reality, based on three fundamental principles:

  1. Reality: If, without disturbing a system, we can predict with certainty the value of a physical quantity, then there exists an element corresponding to that quantity which is real.
  2. Completeness: Every element of physical reality must have a counterpart in the physical theory.
  3. Locality: Physical processes occurring at one location do not instantaneously affect those at another location, in line with the theory of relativity.

Einstein, Podolsky, and Rosen used these principles to illustrate the paradox:

iℏ ∂Ψ(r, t)/∂t = [-ℏ²/(2m) ∇² + V(r, t)] Ψ(r, t)

The time-dependent Schrödinger equation
  • Thought Experiment: Imagine two particles are produced in such a way that their properties, like position and momentum, are correlated. If you measure one particle’s position, quantum mechanics dictates that you can infer the position of the other particle precisely, even if it’s light-years away. Similarly, measuring one particle’s momentum would allow you to know the other’s momentum without direct interaction.
  • The Conundrum: According to quantum mechanics, you can’t know both the exact position and the exact momentum of a particle simultaneously due to Heisenberg’s Uncertainty Principle. Yet, the EPR scenario suggests that if one particle’s property is measured, the other’s corresponding property becomes known, seemingly violating this principle unless one accepts that quantum mechanics is incomplete or that there are hidden variables determining these outcomes.

Aftermath and Interpretations

The EPR paradox led to a series of complex discussions and experiments:

  • Bohr’s Response: Niels Bohr countered the EPR argument by suggesting that the very act of measurement in quantum mechanics changes the system, thus there’s no paradox in the quantum description being contextual and not complete in the classical sense.
  • Bell’s Theorem: In the 1960s, John Stewart Bell formulated Bell’s inequalities, which provided a mathematical framework to test whether quantum mechanics or local hidden variable theories could explain the observed correlations in entangled particles. Experimental tests of these inequalities, notably by Alain Aspect in the 1980s, consistently supported quantum mechanics, disfavoring local hidden variable theories (the short CHSH calculation after this list shows the size of the predicted violation).
  • Modern Interpretations: The EPR paradox has evolved into a cornerstone for understanding quantum entanglement. It has implications in quantum computing, quantum cryptography, and even in philosophical discussions about the nature of reality. Today, the paradox is not seen as a flaw in quantum mechanics but rather an illustration of its counterintuitive nature.
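
A compact way to see what Bell-test experiments measure is to evaluate the CHSH combination for the quantum prediction. The sketch below uses the singlet-state correlation E(a, b) = -cos(a - b) with the standard measurement angles; any local hidden variable theory is bounded by |S| ≤ 2, whereas quantum mechanics reaches 2√2.

import math

# CHSH value for a spin singlet: E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

# Standard angle choices (radians) that maximize the quantum violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"|S| = {abs(S):.3f}  (local bound: 2, quantum maximum: {2 * math.sqrt(2):.3f})")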

Legacy

Einstein Title

The EPR experiment’s legacy is profound:

  • Quantum Entanglement: What Einstein described as “spooky action at a distance” is now a fundamental aspect of quantum mechanics, used in technologies like quantum teleportation.
  • Philosophical Implications: The experiment has opened debates about the nature of reality, determinism, and whether information can travel faster than light, even if only in a correlated quantum state.
  • Scientific Progress: It has driven the development of experimental physics, leading to advancements in how we test and understand quantum phenomena.

In summary, the EPR experiment is not merely a historical footnote in the annals of physics but a pivotal moment that has continuously informed and challenged our understanding of the quantum world. It underscores the peculiar and non-intuitive nature of quantum mechanics, pushing the boundaries of science into realms where classical physics fails to tread.

NVIDIA’s Blackwell GB200: A New Era for Large Language Models like Grok 3.0 https://hadamard.com/c/nvidias-blackwell-gb200-a-new-era-for-large-language-models-like-grok-3-0/ Tue, 07 Jan 2025 17:07:00 +0000 https://hadamard.com/c/?p=920 Continue reading NVIDIA’s Blackwell GB200: A New Era for Large Language Models like Grok 3.0]]> Nvidia GB200

NVIDIA has once again set the pace in the realm of AI hardware with its latest offering, the Blackwell GB200. Unveiled at the GPU Technology Conference (GTC) in 2024, the GB200 is not just an incremental update but a significant leap forward in GPU architecture, especially tailored for the demands of large language models (LLMs) like Grok 3.0. Here’s an in-depth look at what the GB200 entails and its implications for the future of AI.

The Architecture: Blackwell’s Beast

At the heart of the GB200 is the Blackwell GPU architecture, named after the renowned mathematician David Blackwell. This architecture introduces several groundbreaking features:

  • Transistor Count: With a massive 208 billion transistors, the B200 GPU within the GB200 offers a substantial increase from its predecessor, the Hopper H100, which had 80 billion transistors. This allows for more complex computations and higher performance.
  • Performance: The GB200 promises up to 20 petaflops of FP4 compute power. For LLMs, this translates into up to a 30x performance increase for inference workloads compared to the H100, alongside a 25x improvement in energy efficiency. This is a game-changer for models requiring significant computational power, like Grok 3.0 (a rough training-time estimate follows this list).
  • Interconnect: The system includes the fifth-generation NVLink, which supports up to 576 GPUs with over 1 petabyte per second (PB/s) of total bandwidth. This is crucial for training and running trillion-parameter models, allowing for seamless communication between GPU nodes.
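
As a rough sense of scale, the sketch below estimates training time from the headline numbers using the common ~6 x parameters x tokens FLOPs rule of thumb for dense transformers. The cluster size, token count, and utilization figure are assumptions for illustration, and real training would typically run at lower-than-FP4-peak throughput in FP8 or BF16.

# Back-of-the-envelope training-time estimate (all inputs are assumptions).
params = 1e12             # a 1-trillion-parameter model
tokens = 10e12            # 10 trillion training tokens
flops_needed = 6 * params * tokens          # ~6 * N * D rule of thumb

gpus = 1000               # hypothetical cluster size
peak_flops_per_gpu = 20e15                  # headline 20 PFLOPS per GPU (peak)
utilization = 0.3                           # assumed sustained fraction of peak

seconds = flops_needed / (gpus * peak_flops_per_gpu * utilization)
print(f"~{seconds / 86400:.0f} days")       # order-of-magnitude only (~116 days)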

Implications for Large Language Models

Nvidia GB200 integrated
  • Scale and Efficiency: The GB200’s design is particularly beneficial for scaling LLMs. With the ability to handle models of up to 10 trillion parameters, it opens up possibilities for more sophisticated AI applications, from advanced natural language processing to complex generative AI tasks. The energy efficiency improvements mean that these models can be run more sustainably, reducing both cost and environmental impact.
  • Real-time Processing: The reduced inference time, down to milliseconds for large-scale models, positions the GB200 as a tool for real-time AI applications. For Grok 3.0 or similar models, this could mean faster, more responsive AI assistants or real-time translation and content generation.
  • Cost Reduction: The significant reduction in cost and energy consumption could democratize access to high-performance AI. Smaller companies and research institutions might now be able to leverage advanced AI models without prohibitive expenses, fostering innovation across more sectors.

For Grok 3.0 and Beyond

  • Training Grok 3.0: With its enhanced capabilities, the GB200 could drastically cut down the training time for models like Grok 3.0. Where previous generations might have taken months for trillion-parameter models, Blackwell’s architecture could reduce this to weeks or even days.
  • Model Complexity: Grok 3.0 could be built with more layers or parameters, potentially leading to better understanding, reasoning, and generation of human-like responses. The GB200’s architecture supports the kind of depth and breadth in neural network design that was previously impractical or cost-prohibitive.
  • New Use Cases: With such performance boosts, applications for LLMs could expand into areas like real-time multi-modal AI, where models can process and respond to various forms of input (text, voice, images) simultaneously, something that was challenging with less powerful hardware.

Conclusion

NVIDIA’s Blackwell GB200 represents a paradigm shift in how we approach AI computation, particularly for the burgeoning field of large language models. For AI models like Grok 3.0, this means not only an increase in capability but also a reduction in the barriers to entry for high-level AI research and development. The GB200 could be the catalyst that propels AI technologies into new markets and applications, fostering a new wave of innovation in artificial intelligence. As the technology becomes more widespread, we might see a surge in AI-driven solutions, making what was once science fiction, science fact.

The Black Hole Information Paradox: A Cosmic Conundrum https://hadamard.com/c/the-black-hole-information-paradox-a-cosmic-conundrum/ Sat, 04 Jan 2025 16:40:05 +0000 https://hadamard.com/c/?p=914 Continue reading The Black Hole Information Paradox: A Cosmic Conundrum]]> The universe is filled with enigmas, but few are as perplexing as the black hole information paradox. This paradox arises from the clash between two of the most robust theories in physics: quantum mechanics and general relativity. Here’s a dive into the heart of this paradox, exploring its origins, the ongoing debate, and the latest insights into one of science’s greatest puzzles.

Origins of the Paradox

In 1974, physicist Stephen Hawking proposed that black holes are not entirely black; rather, they emit what’s now known as Hawking radiation. This radiation implies that black holes have a temperature and, over vast timescales, can evaporate completely. However, this evaporation poses a significant problem for the conservation of information, a cornerstone of quantum mechanics.
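
For context, Hawking’s result ties a black hole’s temperature, and hence its evaporation timescale, to its mass. These are standard textbook expressions, quoted here only for reference:

T_H = ħc³ / (8πG M k_B),    t_evap ≈ 5120 π G² M³ / (ħ c⁴)

A more massive black hole is therefore colder and evaporates far more slowly, which is why the paradox only becomes acute over enormous timescales.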

Quantum mechanics dictates that information about the state of a system (such as the position, spin, and other quantum properties of particles) cannot be destroyed or lost. When matter falls into a black hole, it’s presumed that all such information gets trapped. But if a black hole evaporates, what happens to this information? If it’s lost, then the principle of information conservation is violated, leading to the paradox.

The Debate Unfolds

Over the decades, physicists have proposed various solutions:

  • Information Conservation: Some theories suggest that information is not lost but rather encoded on the event horizon of the black hole in a form that can be retrieved. This idea aligns with the holographic principle, which posits that all the information contained in a volume of space can be described by a theory on the boundary of that space.
  • Unitary Evolution: The belief that black hole evaporation must be a unitary process, meaning the evolution of the system is reversible, has been a focal point. This would imply that the information does indeed escape with the Hawking radiation, albeit in a highly scrambled form.
  • Quantum Entanglement: Recent discussions have leaned heavily into quantum entanglement. A suggestion is that the interior of a black hole might be entangled with the radiation it emits, creating a kind of quantum bridge that allows information to escape.
  • Wormholes and Islands: Concepts like wormholes and “entanglement islands” have been explored, suggesting that information might not be lost but rather redistributed in ways that are accessible outside the black hole’s horizon.

Latest Developments

Black Hole

In recent years, the dialogue around the black hole information paradox has seen some intriguing developments:

  • Quantum Correlations: There’s growing interest in how quantum correlations in spacetime could play a role. Theories suggest that these correlations might provide a mechanism for information to escape, potentially solving the paradox by ensuring information is preserved in the evaporation process.
  • Non-Violent Non-Locality: Some physicists, including Steven Giddings, have proposed that non-locality, where parts of the universe are connected in ways not bound by traditional space-time, could resolve the paradox without violent disruptions to known physics.
  • Experimental Insights: While direct observation of Hawking radiation remains out of reach, analog experiments using systems that mimic black holes in controlled environments are providing insights into how information might behave under similar conditions.

The black hole information paradox is far from resolved, but the scientific community’s pursuit of an answer has led to profound insights into quantum mechanics, gravity, and the very nature of information itself. Each new theoretical proposal or experimental test brings us closer to understanding not just black holes, but the fabric of reality.

Conclusion

The black hole information paradox remains one of the most tantalizing puzzles in modern physics. It challenges our understanding of the universe’s fundamental laws and pushes the boundaries of theoretical physics. As research continues, with both theoretical advancements and potential experimental validations, science edges closer to reconciling the quantum with the cosmic, possibly rewriting our understanding of the universe in the process.

For those interested in delving deeper, the discussions are vibrant across scientific communities, with posts on platforms like X and articles in journals like Scientific American and New Scientist offering a window into the ongoing scientific narrative.

Quantum Breakthrough: First Quantum Teleportation Over Active Fiber Optic Cables https://hadamard.com/c/quantum-breakthrough-first-quantum-teleportation-over-active-fiber-optic-cables/ Wed, 01 Jan 2025 19:48:02 +0000 https://hadamard.com/c/?p=911 Continue reading Quantum Breakthrough: First Quantum Teleportation Over Active Fiber Optic Cables]]> Fiber cables

In a groundbreaking development that pushes the boundaries of quantum communication, researchers at Northwestern University have achieved the first quantum teleportation over a fiber optic cable actively carrying normal internet traffic. This experiment marks a significant step toward secure and practical quantum communication.

What is Quantum Teleportation?

Quantum teleportation is a process in which the state of a quantum object, such as a photon, is transferred to another quantum object without physically moving the object itself. It leverages quantum entanglement, combined with an ordinary classical communication channel, to transfer quantum states in a way that cannot be read or copied in transit.
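
The protocol itself is short enough to simulate directly. Below is a minimal NumPy statevector sketch of textbook single-qubit teleportation: Alice holds the unknown state and half of a Bell pair, performs a Bell-basis measurement, and Bob applies the corresponding Pauli corrections. This illustrates the logic only and says nothing about the photonic implementation used in the Northwestern experiment.

import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def apply_1q(gate, qubit, n, state):
    # Apply a single-qubit gate to one qubit of an n-qubit statevector.
    ops = [gate if q == qubit else I2 for q in range(n)]
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def cnot(control, target, n, state):
    # Apply CNOT by permuting basis states (qubit 0 = most significant bit).
    new = np.zeros_like(state)
    for idx in range(len(state)):
        bits = [(idx >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        new[sum(b << (n - 1 - q) for q, b in enumerate(bits))] = state[idx]
    return new

# Qubit 0: the state |psi> = a|0> + b|1> to teleport; qubits 1, 2: a Bell pair.
a, b = 0.6, 0.8j
psi = np.array([a, b], dtype=complex)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)

# Alice's Bell-basis measurement circuit on qubits 0 and 1.
state = cnot(0, 1, 3, state)
state = apply_1q(H, 0, 3, state)

# Sample a measurement outcome for qubits 0 and 1.
probs = np.abs(state) ** 2
outcome = np.random.default_rng().choice(8, p=probs / probs.sum())
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

# Collapse to the measured branch; qubit 2 now carries the (uncorrected) state.
q2 = np.array([state[(m0 << 2) | (m1 << 1)], state[(m0 << 2) | (m1 << 1) | 1]])
q2 /= np.linalg.norm(q2)

# Bob's classical corrections: X if m1 = 1, then Z if m0 = 1.
if m1:
    q2 = X @ q2
if m0:
    q2 = Z @ q2

print("original:  ", psi)
print("teleported:", np.round(q2, 6))   # matches |psi> up to a global phase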

Experimental Details

The research team, led by Prem Kumar, developed a method to navigate delicate quantum information through the bustling traffic of light signals in a fiber optic cable. They identified a wavelength less affected by internet traffic and used special filters to further minimize disturbances.

In their experiment, a 30-kilometer-long fiber optic cable was set up with a photon at each end. While internet traffic coursed through the cable, the scientists successfully teleported quantum information without it being disrupted by the dense network of conventional data bursts.

Implications for the Future

This breakthrough could open the door to a new era of communication technology where quantum and classical networks can coexist. The possibility of quantum communication over existing infrastructure reduces the need to build specialized networks for quantum information, potentially easing the availability and deployment of quantum communication by lowering costs and efforts.

  • Security: Quantum communication offers nearly unbreakable security through the principles of quantum mechanics, where any attempt to eavesdrop on the communication disrupts the quantum system, making it immediately detectable.
  • Integration: Integrating into existing fiber optic networks could pave the way for a “quantum internet,” where secure communication over long distances is possible without the need for special cables.

Conclusion

This advancement in quantum technology is not only a scientific triumph but also a practical step towards a future where our communication networks could be safer and more efficient. The experiment demonstrates that the theory of quantum physics can be translated into practice to solve real-world communication issues. The next steps will likely involve extending transmission distances and testing the technology under real-world conditions to eventually enable broader application in secure information transfer.

This breakthrough has been widely reported by various sources, including technical journals and news outlets, highlighting the significance of this experiment for the future of quantum communication.

]]>
911
How would Quantum Supremacy affect Ethereum? https://hadamard.com/c/how-would-quantum-supremacy-affect-ethereum/ Sat, 28 Dec 2024 19:22:24 +0000 https://hadamard.com/c/?p=900 Continue reading How would Quantum Supremacy affect Ethereum?]]> Quantum supremacy, a milestone signifying that quantum computers can solve problems beyond the reach of classical machines, is poised to revolutionize many industries. Ethereum, a decentralized blockchain platform that relies heavily on cryptographic algorithms for security, is one such technology potentially impacted by quantum advancements. This article explores the implications of quantum supremacy on Ethereum, focusing on its cryptographic vulnerabilities, potential solutions, and the timeline for such changes.

  • RSA is an encryption algorithm whose security rests on the hardness of the integer factorisation problem: finding the prime factors of a large composite number (the product of two large primes) is computationally difficult.
  • ECDSA is a signature scheme whose security rests on the hardness of the elliptic curve discrete logarithm problem.

While computing discrete logarithms and factoring integers are distinct problems, both become efficiently solvable on a sufficiently large quantum computer.

  • In 1994, the American mathematician Peter Shor devised a quantum algorithm that factors integers, and hence breaks RSA, in polynomial time; by contrast, breaking 2048-bit RSA on a classical computer is estimated to take on the order of 300 trillion years. (A toy illustration of the classical post-processing behind Shor’s algorithm follows this list.)
  • ECDSA has been shown to be vulnerable to a modified version of Shor’s algorithm and is expected to fall to a quantum attack with fewer resources than RSA because of its smaller key sizes.
  • A 160-bit elliptic curve key could be broken on a quantum computer using around 1,000 qubits, while factoring the security-equivalent 1024-bit RSA modulus would require about 2,000 qubits.
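
For readers curious about what Shor’s algorithm actually contributes, the sketch below runs the classical number-theoretic shell of the algorithm on tiny numbers, finding the period of a randomly chosen base by brute force; that period-finding step is precisely the part a quantum computer performs exponentially faster. It is a toy for numbers like 15 or 21, not an attack on real RSA moduli, and every name in it is illustrative.

```python
from math import gcd
from random import randrange

def find_order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N). This brute-force search is the
    step Shor's algorithm replaces with quantum period finding."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_postprocess(N):
    """Classical post-processing of Shor's algorithm for an odd composite N."""
    while True:
        a = randrange(2, N)
        d = gcd(a, N)
        if d > 1:                 # lucky guess: a already shares a factor with N
            return d, N // d
        r = find_order(a, N)
        if r % 2 == 1:
            continue              # odd period, try another base
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue              # trivial square root of 1, try another base
        for candidate in (gcd(y - 1, N), gcd(y + 1, N)):
            if 1 < candidate < N:
                return candidate, N // candidate

print(shor_postprocess(15))   # e.g. (3, 5)
print(shor_postprocess(21))   # e.g. (3, 7)
```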

Ethereum currently uses elliptic curve schemes: ECDSA for signing transactions and BLS signatures for aggregation. Because the security of elliptic curve cryptography rests on the difficulty of the discrete logarithm problem, it is vulnerable to quantum computers and would eventually have to be replaced with a quantum-resistant scheme.

The hash function SHA-256 is considered quantum-safe: there is no known efficient algorithm, classical or quantum, that can invert it.

There is a known quantum algorithm, Grover’s algorithm, which performs a “quantum search” over a black-box function, yet SHA-256 remains secure against both collision and preimage attacks. Grover’s algorithm only reduces a search over N possibilities to roughly √N queries of the black box (here, SHA-256), so a preimage search drops from 2^256 to about 2^128 evaluations, which is still far beyond practical reach and, once the overhead of running the hash on quantum hardware is taken into account, arguably no faster than generic classical attacks such as the van Oorschot–Wiener algorithm for collision search or Oechslin’s rainbow tables for preimage search.
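
The arithmetic behind that claim is short enough to check directly. The snippet below compares a brute-force preimage search with Grover’s square-root speedup and with the classical birthday bound for collisions; the numbers, not the code, are the point.

```python
from math import isqrt, log2

hash_bits = 256

brute_force_preimage = 2 ** hash_bits            # classical exhaustive search
grover_preimage = isqrt(brute_force_preimage)    # ~sqrt(N) quantum queries
birthday_collision = 2 ** (hash_bits // 2)       # generic classical collision bound

print(f"classical preimage search: 2^{log2(brute_force_preimage):.0f} hash evaluations")
print(f"Grover preimage search:    2^{log2(grover_preimage):.0f} quantum queries")
print(f"classical birthday bound:  2^{log2(birthday_collision):.0f} hash evaluations")
# Grover brings preimages down to 2^128 queries: the same order of magnitude
# as the classical collision bound, and still far outside practical reach.
```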

Solution

Hadamard addresses the signing vulnerability described above and has used this approach to design a quantum-resistant protocol for the signer.

The GPON topology, along with the QKD setup, shown both as a diagram and in its physical realization. Source: arXiv:2310.17259
  • Quantum-Secure Key Generation: The involved parties, such as an Ethereum transaction sender and a node in the Ethereum network, use Quantum Key Distribution (QKD) to generate a quantum-secure key. By leveraging the principles of quantum mechanics, a key is produced that cannot be compromised even by a quantum computer, and it is securely shared between the parties as the basis for a secure communication channel.
  • Key Exchange and Authentication: The quantum-secure key is used to authenticate the identities of the involved parties. Before a transaction is sent, it is encrypted with the jointly generated key so that only the authorized parties can read it.
  • Signing the Encrypted Transaction: After encryption, the sender signs the transaction with their private key, as in conventional signature schemes. Because the transaction is already encrypted, it can only be decrypted and read by the intended recipient.
  • Decryption and Verification: The recipient decrypts the received transaction using the jointly generated quantum-secure key, then verifies the signature to confirm that the transaction is authentic and indeed originates from the specified sender. (A minimal end-to-end sketch of these four steps follows below.)
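
To make the ordering of these four steps concrete, here is a minimal, purely classical sketch in Python. Every primitive in it is a stand-in: os.urandom plays the role of a key delivered over a QKD link, a hash-derived XOR keystream plays the role of the cipher, and an HMAC plays the role of the sender’s signature (Ethereum itself uses ECDSA today). It illustrates the flow described above, not Hadamard’s actual protocol.

```python
import hashlib
import hmac
import os

# Stand-ins for keys. In the protocol above, the shared key would come from
# a QKD link; os.urandom is only a classical placeholder (step 1).
qkd_shared_key = os.urandom(32)
sender_signing_key = os.urandom(32)      # sender's private signing key

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """XOR the data with a keystream stretched from the shared key.
    A production system would use an authenticated cipher instead."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Step 2: encrypt the transaction with the jointly generated key.
transaction = b'{"to": "0xRecipientAddress", "value": "1 ETH", "nonce": 7}'
ciphertext = keystream_xor(transaction, qkd_shared_key)

# Step 3: sign the encrypted transaction. HMAC stands in for the sender's
# signature scheme; a real signature would be asymmetric.
signature = hmac.new(sender_signing_key, ciphertext, hashlib.sha256).digest()

# Step 4: the recipient verifies the signature and decrypts with the shared key.
expected = hmac.new(sender_signing_key, ciphertext, hashlib.sha256).digest()
assert hmac.compare_digest(signature, expected)
recovered = keystream_xor(ciphertext, qkd_shared_key)
assert recovered == transaction
print("transaction decrypted and signature verified")
```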

Sources:

https://quside.com/how-does-quantum-key-distribution-qkd-work/

https://www.techtarget.com/searchsecurity/definition/quantum-key-distribution-QKD

https://en.wikipedia.org/wiki/Quantum_cryptography

https://arxiv.org/abs/2310.17259

]]>
900
Grok 3: The Next Leap Forward in AI Innovation https://hadamard.com/c/grok-3-the-next-leap-forward-in-ai-innovation/ Tue, 24 Dec 2024 15:26:05 +0000 https://hadamard.com/c/?p=897 Continue reading Grok 3: The Next Leap Forward in AI Innovation]]>

Since the inception of Grok by xAI, the AI landscape has been buzzing with anticipation for each new iteration. As we edge closer to 2025, attention has turned to Grok 3, which promises to redefine the standards of artificial intelligence with its advanced capabilities. Here’s what we know so far about Grok 3, its potential impact, and what it means for the future of AI.

The Evolution of Grok

Grok, initially launched in November 2023, was designed with a unique persona, inspired by the likes of Douglas Adams’ “The Hitchhiker’s Guide to the Galaxy.” It was intended to provide not just answers but insights with a dash of humor and a rebellious streak. Grok 3 continues this legacy, but with significantly enhanced features:

  • Training Power: Grok 3 is being trained with an unprecedented amount of compute power. Posts on X suggest it’s starting with 10X the compute of its predecessor, Grok 2, with plans to scale up to 20X. This could mean a leap in performance, potentially matching or surpassing the capabilities of other leading AI models like the anticipated GPT-5.
  • Real-World Application: Unlike many AI models that are confined to text-based interactions, Grok 3 is expected to delve into more complex and real-world applications. There’s speculation, fueled by social media discussions, that Grok 3 could be particularly effective in legal contexts, acting almost like a “personal lawyer” available around the clock.

Technical Enhancements

  • GPU Powerhouse: The training of Grok 3 involves a massive setup of over 100,000 Nvidia H100 GPUs at a new supercomputing facility in Memphis, Tennessee. This scale of computing resource is indicative of the ambition behind Grok 3, aiming to push the boundaries of what AI can achieve.
  • Learning from Previous Models: Grok 3 builds on the successes and lessons from Grok-1 and Grok-2. The previous models’ open-source approach, real-time data access from X, and the ability to engage with “spicy” questions have all contributed to shaping Grok 3’s development strategy.

Expected Impact

  • AI Landscape: If Grok 3 lives up to the hype, it could become the new benchmark for AI models, setting the pace for others in terms of performance, utility, and possibly even in ethical AI application. The increased compute power and focus on real-world applications could make Grok 3 a significant player in industries ranging from law to content creation.
  • User Experience: With Grok 3, users might experience an AI that not only understands but also interprets and interacts with the world in a more nuanced way. Its humorous and straightforward communication style could become even more refined, making AI interactions more human-like than ever.
  • Ethical Considerations: As AI models become more powerful, ethical considerations grow in importance. Grok 3’s development will be under scrutiny to ensure it adheres to principles of fairness, accountability, and transparency, especially given its access to real-time data from social platforms.

Challenges and Expectations

While the anticipation for Grok 3 is palpable, challenges remain. The integration of such advanced AI into daily life poses questions about data privacy, the potential for misinformation, and the societal implications of AI that can “think” at a near-human level. Additionally, the humorous and sometimes rebellious tone of Grok could lead to unforeseen issues if not moderated appropriately.

Looking Ahead

Grok 3’s release could mark a pivotal moment in AI history, potentially leading the charge into a new era where AI not only assists but also innovates, educates, and entertains in ways previously unimagined. With xAI’s track record of pushing boundaries, Grok 3 could indeed “grok our world,” making AI an even more integral part of our daily lives.

As we await its official unveiling, one thing is clear: Grok 3 is set to be a significant milestone in the journey of AI development, one that we’ll all be watching closely.

Sources:

https://medium.com/@InnovateForge/grok-1-open-release-0429134709b8

https://manifold.markets/predyx_markets/will-grok-3-be-released-by-dec-31-2?play=true

https://u.today/elon-musk-grok-30-will-be-most-powerful-ai-in-world-sooner-than-you-think

https://analyticsindiamag.com/ai-news-updates/elon-musks-xai-to-release-grok-2-next-month-grok-3-in-december/

https://aibusiness.com/nlp/musk-confirms-grok-2-coming-in-august-grok-3-by-end-of-the-year

https://x.com/xDaily/status/1837505440457904270

https://x.com/AlexFinnX/status/1815465861592887380

]]>
897
Starfield: The Galactic Disappointment https://hadamard.com/c/starfield-the-galactic-disappointment/ https://hadamard.com/c/starfield-the-galactic-disappointment/#respond Sun, 22 Dec 2024 09:43:35 +0000 https://hadamard.com/c/?p=351 Continue reading Starfield: The Galactic Disappointment]]>

Welcome, fellow space adventurers, to the high-stakes, low-reward world of Starfield—Bethesda’s attempt at making you feel like a space pioneer, only to realize you’re more like a space janitor cleaning up after the universe’s biggest party.

The Exploration Fantasy That Never Took Off

First off, let’s talk about exploration. Remember when Bethesda promised a universe where you could explore 1,000 planets? Well, they delivered… kind of. You can explore them, sure, but for most, it’s like visiting a thousand different backyards filled with nothing but rocks, more rocks, and occasional space skunks. The thrill of discovery? Nah, it’s more like the thrill of finding yet another empty can in a vast cosmic landfill.

The Fast Travel That’s Too Fast

Fast travel in Starfield is like the universe’s version of public transport – it’s there, but it’s not the journey you signed up for. Imagine being promised the ability to pilot your ship across the stars, only to end up selecting destinations from a menu. It’s less “Star Trek” and more “Star Uber.” You’re not a space captain; you’re just a glorified taxi driver with no control over the route.

Combat: Space Guns That Feel Like Toys

Combat in Starfield has the charm of a middle school play where everyone forgot their lines. You’ve got guns that might as well be made from bubblegum, especially when you’re facing off against creatures that look like they’ve been through a blender set to “Puree.” The zero-g combat? Let’s just say it’s more like zero fun – introduced with much fanfare and then promptly forgotten like last year’s diet.

The NPCs: Glitchy, Goofy, and Glaringly Weird

Bethesda’s NPCs are back with a vengeance, but this time, they’re even more unpredictable. From NPCs popping out of walls like they’re playing Whac-A-Mole to characters with missing heads (because, apparently, space helmets are optional), Starfield offers a comedy show that you didn’t pay for but are certainly getting.

The Storyline That’s More Like a Choose Your Own Adventure Book

The narrative in Starfield feels like someone threw the script into a blender with a bit of every sci-fi cliché in existence. You’re the chosen one, again, in a universe where everything is so stagnant that even the robots seem to have given up. The writing? Well, let’s just say it’s “inspired” by every sci-fi movie you’ve seen, but forgot to include the part where the plot actually makes sense.

Technical Faux Pas

And let’s not forget the technical side of things. Starfield is a masterclass in how to make a game look beautiful while simultaneously making it crash more often than a rookie pilot on their first day. Performance issues, bugs, and glitches abound, turning your space adventure into a cosmic game of “spot the bug” where the bugs are not just in the code but also in the game’s wildlife.

Conclusion: A Universe of Potential, Wasted

Starfield had the potential to be the next great space saga. Instead, it’s like watching a fireworks show where most of the rockets fizzle out before they reach the sky. Sure, there are moments of beauty and awe, but they’re buried under layers of design choices that make you question if the developers ever played their own game.

So, if you’re looking for a space game where exploration feels like a chore, combat feels like a toy fight, and the story feels like someone’s first draft, then Starfield might just be the disappointment you never knew you needed. Welcome to the galaxy, where everything’s wrong, but hey, at least the bugs are funny.

]]>
https://hadamard.com/c/starfield-the-galactic-disappointment/feed/ 0 351
Quantum Leap Forward: A New Method for Creating Quantum Entanglement https://hadamard.com/c/quantum-leap-forward-a-new-method-for-creating-quantum-entanglement/ Fri, 20 Dec 2024 23:15:19 +0000 https://hadamard.com/c/?p=892 Continue reading Quantum Leap Forward: A New Method for Creating Quantum Entanglement]]>

In a groundbreaking revelation for the field of quantum physics, researchers have unveiled a novel technique for generating quantum entanglement, the mysterious phenomenon where two or more particles become inextricably linked, sharing a quantum state regardless of the distance between them. This development, which could revolutionize quantum communication networks, promises to enhance the speed, security, and efficiency of information transfer across vast distances. Here’s an in-depth look at this significant advancement.

What is Quantum Entanglement?

Quantum entanglement, often dubbed “spooky action at a distance” by Albert Einstein, is a phenomenon where particles become interconnected in such a way that the quantum state of each particle cannot be described independently of the state of the others, even when separated by large distances. This property is fundamental to quantum mechanics and has been a cornerstone for developing quantum technologies, including quantum computing and quantum cryptography.

The New Method Discovered

The traditional methods for creating quantum entanglement often involve complex setups and significant resources, like the use of entangled photon pairs generated through spontaneous parametric down-conversion or the entanglement of atoms in ultra-cold environments. However, the new method discovered bypasses these complexities:

  • AI-Driven Discovery: Researchers used an AI tool named PyTheus, which was initially developed to reproduce established entanglement-swapping protocols. In a twist, PyTheus suggested a simpler method that didn’t rely on pre-entangled pairs or Bell-state measurements, instead leveraging the indistinguishability of photon paths to create entanglement. This approach was unexpected and significantly reduces the complexity and resource requirements for achieving entanglement.
  • Simplification of Entanglement Generation: Instead of starting with already entangled photons, this method entangles independent photons based on their paths, challenging the established belief that entanglement must be initiated through direct interaction or pre-existing entanglement.

Implications for Quantum Communication Networks

  • Enhanced Security: Quantum entanglement is the bedrock of quantum key distribution (QKD), where secure communication is ensured by the principle that any attempt to measure one particle of an entangled pair disturbs the shared state, thereby alerting users to eavesdropping attempts. This new method could make QKD more accessible and less resource-intensive. (A toy simulation of this eavesdropping-detection principle follows this list.)
  • Long-Distance Quantum Communication: With entanglement being less dependent on specific conditions, this method could facilitate the creation of quantum networks over larger distances. Current limitations in quantum repeaters and the decoherence of quantum states over distance might be mitigated with this technology.
  • Scalability: The simplification could lead to more scalable quantum networks. If entanglement can be generated more easily, the infrastructure for quantum internet might become more practical, allowing for a network where quantum information can be relayed across continents or even globally.
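
To illustrate the eavesdropping-detection principle mentioned in the security point above, here is a toy simulation of the BB84 key-distribution protocol, which relies on the same idea: measuring a quantum state in the wrong basis randomizes it, so an interceptor inevitably leaves a fingerprint in the error rate. All parameters are illustrative, and nothing quantum is actually simulated beyond these statistics.

```python
import random

def bb84_error_rate(n_bits=2000, eavesdrop=False, seed=1):
    """Run a toy BB84 exchange and return the observed error rate."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("ZX") for _ in range(n_bits)]
    bob_bases = [rng.choice("ZX") for _ in range(n_bits)]

    bob_results = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        value, basis = bit, a_basis
        if eavesdrop:                        # Eve measures in a random basis
            eve_basis = rng.choice("ZX")
            if eve_basis != basis:
                value = rng.randint(0, 1)    # wrong basis randomizes the bit
            basis = eve_basis                # photon is re-sent in Eve's basis
        if b_basis != basis:
            value = rng.randint(0, 1)        # Bob's wrong basis randomizes too
        bob_results.append(value)

    # Keep only the rounds where Alice's and Bob's bases matched (sifting),
    # then compare them to estimate the error rate an eavesdropper would cause.
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_results, alice_bases, bob_bases) if ab == bb]
    errors = sum(1 for a, b in sifted if a != b)
    return errors / len(sifted)

print(f"no eavesdropper:   {bb84_error_rate(eavesdrop=False):.1%} errors")   # ~0%
print(f"with eavesdropper: {bb84_error_rate(eavesdrop=True):.1%} errors")    # ~25%
```

An intercept-and-resend attacker introduces errors in roughly a quarter of the sifted bits, which is exactly what the legitimate parties check for before trusting the key.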

Challenges and Future Research

  • Noise and Environmental Factors: Quantum systems are extremely sensitive to environmental interactions, which can destroy entanglement. Researchers must determine how this method holds up under real-world conditions, including noise from temperature, vibrations, or electromagnetic interference.
  • Scalability to Practical Networks: While the method simplifies entanglement at a microscopic level, scaling this up to a network that can handle practical, everyday communication is a significant hurdle.
  • Integration with Existing Infrastructure: How this new method will integrate with or replace current technologies in quantum computing and communication is yet to be explored in detail.

Conclusion

This new approach to quantum entanglement represents a pivotal moment in quantum science. By making entanglement more accessible and less resource-intensive, it not only opens up new research directions but also hastens the practical application of quantum technologies in communication. The journey from this discovery to a fully operational quantum internet or secure communication network will involve much more research and innovation. However, this step has undeniably shifted what was previously thought possible, bringing the quantum future closer to reality.

]]>
892
Quantum Physics in 2024: A Year of Groundbreaking Discoveries https://hadamard.com/c/quantum-physics-in-2024-a-year-of-groundbreaking-discoveries/ Fri, 20 Dec 2024 11:33:05 +0000 https://hadamard.com/c/?p=884 Continue reading Quantum Physics in 2024: A Year of Groundbreaking Discoveries]]>

The year 2024 has been a landmark period for quantum physics, with advancements that promise to reshape our understanding of the universe and our technological capabilities. Here’s a comprehensive look at the key developments in the field.

Quantum Computing Breakthroughs

  • Error Correction Milestone: Researchers at Google Quantum AI achieved a significant milestone in quantum error correction. By combining many physical qubits into a single logical qubit, they demonstrated that increasing the number of physical qubits can suppress the logical error rate rather than amplify it. This could pave the way for more stable and practical quantum computers, bringing us closer to machines that outperform classical computers on specific tasks. (A toy classical analogue of this redundancy idea appears after this list.)
  • The Willow Chip: Google introduced the Willow chip, which marks a significant advancement in reducing errors in quantum computing. This chip not only increases the number of qubits but also introduces new methods for error correction, potentially making quantum computing more viable for commercial applications.
  • Mechanical Qubit: In a historic moment, physicists created the first fully mechanical qubit, which could have profound implications for quantum computing and the study of quantum entanglement. This development points to new pathways for quantum information processing that do not rely solely on electrical or magnetic properties.
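
The surface-code experiment itself is far beyond a blog snippet, but the underlying idea of spending more physical resources to protect one logical bit has a simple classical analogue: the repetition code with majority voting. The sketch below is that analogue only, under the assumption of independent bit flips, and is not a model of Google’s qubits.

```python
import random

def encode(bit, n):
    """Encode one logical bit into n physical copies (repetition code)."""
    return [bit] * n

def apply_noise(physical, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in physical]

def decode(physical):
    """Majority vote recovers the logical bit if fewer than half flipped."""
    return int(sum(physical) > len(physical) / 2)

def logical_error_rate(n, p_flip, trials=50_000):
    errors = sum(decode(apply_noise(encode(0, n), p_flip)) != 0
                 for _ in range(trials))
    return errors / trials

# Below the code's threshold, adding physical bits suppresses logical errors.
for n in (1, 3, 9, 27):
    print(f"n = {n:2d} physical bits: logical error rate = {logical_error_rate(n, 0.05):.5f}")
```

With a 5% physical error rate, going from 1 to 27 copies pushes the logical error rate from 5% down to essentially zero, which is the qualitative behaviour the Google result demonstrates for genuinely quantum errors.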

Quantum Algorithms and Theory

  • Energy State Calculation: Physicists have developed a powerful quantum algorithm capable of calculating the local minimum energy state of a quantum system. This not only showcases the potential computational advantage of quantum systems over classical ones but also has implications for understanding complex chemical reactions and material properties at a quantum level.

Experimental Quantum Physics

  • Quantum Fluctuations and Universe Creation: An experiment observed for the first time how quantum fluctuations could trigger transitions akin to those that might have created our universe. This experiment bridges theoretical quantum mechanics with cosmology, offering insights into the mechanisms behind the Big Bang.

Entanglement and Quantum Components

  • Shrinking Quantum Components: A revolutionary method was developed to produce entangled quantum components at a scale reduced by a factor of 1000. This miniaturization could lead to quantum devices that are not only more compact but also more efficient, potentially revolutionizing quantum sensors and communication systems.

Future Implications

The developments in 2024 are not just academic milestones; they are stepping stones towards practical applications that could transform industries. From drug discovery to secure communication and beyond, the potential for quantum technology to solve problems currently intractable for classical computers is becoming increasingly tangible.

These advancements also echo the exponential growth in technology that Dr. Singularity has noted, suggesting that the pace of breakthroughs in quantum physics might continue to accelerate, setting the stage for even more profound innovations in the coming years.

As we move into 2025, the quantum age appears to be truly beginning, with these recent discoveries highlighting both the challenges and the immense possibilities in harnessing quantum mechanics for practical use. The future of quantum physics looks not only bright but also fundamentally transformative.

]]>
884
The science behind Interstellar https://hadamard.com/c/the-science-behind-interstellar/ Sat, 16 Mar 2024 16:30:27 +0000 https://hadamard.com/c/?p=350

]]>
350
Is 2024 the New 1984 https://hadamard.com/c/is-2024-the-new-1984/ Sat, 09 Mar 2024 16:18:12 +0000 https://hadamard.com/c/?p=775 Continue reading Is 2024 the New 1984]]> 1984

Echoes of Orwell: Examining Parallels Between 1984 and 2024

In the year 2024, the echoes of Orwell’s prophetic masterpiece, “1984,” continue to reverberate through the corridors of society, serving as a chilling reminder of the perils of unchecked power and the erosion of individual freedoms. As we delve into the parallels between Orwell’s dystopian vision and the realities of our own time, it becomes increasingly apparent that the themes and warnings of “1984” remain as relevant today as they were upon its publication over half a century ago.

One of the most striking parallels between “1984” and 2024 lies in the omnipresence of surveillance and the erosion of privacy. In Orwell’s novel, the Party’s surveillance state, epitomized by the ever-watchful Big Brother, monitors every aspect of citizens’ lives, invading even the most intimate spaces. Today, we find ourselves grappling with similar concerns as advancements in technology afford governments and corporations unprecedented access to our personal data. From ubiquitous CCTV cameras to the pervasive tracking of online activities, the boundaries between public and private spheres continue to blur, echoing the omnipresent surveillance of Orwell’s dystopia.

Furthermore, the manipulation of truth and the proliferation of propaganda are recurring themes that resonate deeply with contemporary society. In “1984,” the Party employs Newspeak and doublespeak to control the narrative and suppress dissent, rewriting history to suit its agenda. Similarly, in 2024, we witness the spread of misinformation and the rise of so-called “fake news,” fueled by the proliferation of social media and the erosion of trust in traditional media outlets. The distortion of reality, coupled with the dissemination of alternative facts, undermines the very foundations of democracy, echoing the insidious tactics employed by the Party in Orwell’s fictional world.

Moreover, the erosion of individual autonomy and the suppression of free thought are hallmarks of both Orwell’s dystopia and the realities of our own time. In “1984,” the Thought Police enforce ideological conformity, punishing any deviation from the Party’s orthodoxy with ruthless efficiency. Similarly, in 2024, we witness the rise of censorship and the stifling of dissenting voices, whether through online de-platforming or legislative measures aimed at curbing free speech. The specter of censorship looms large, casting a pall over our ability to engage in open dialogue and challenge prevailing narratives, mirroring the authoritarianism depicted in Orwell’s cautionary tale.

As we confront the parallels between “1984” and 2024, it becomes evident that Orwell’s dystopian vision serves as a stark warning against the dangers of unchecked power and the erosion of individual freedoms. In an age defined by surveillance, manipulation, and suppression, the lessons of “1984” compel us to remain vigilant in the defense of our liberties and to resist the encroachment of authoritarianism in all its forms. Only by heeding these warnings and upholding the principles of truth, freedom, and democracy can we hope to avoid the nightmarish future depicted in Orwell’s seminal work.

Surveillance, Control, and Manipulation: Unveiling Similarities Between 1984 and 2024

In the ever-evolving landscape of surveillance, control, and manipulation, the parallels between George Orwell’s “1984” and the year 2024 are as profound as they are disconcerting. As we peel back the layers of societal structure and technological advancement, it becomes increasingly apparent that Orwell’s dystopian vision serves as a haunting reflection of our own reality.

Surveillance stands as a cornerstone of both “1984” and the contemporary world. In Orwell’s novel, the omnipresent gaze of Big Brother looms over every aspect of citizens’ lives, instilling a pervasive sense of fear and paranoia. Fast forward to 2024, and the proliferation of surveillance technology has reached unprecedented levels, with governments and corporations harnessing the power of cameras, drones, and data analytics to monitor individuals on an unprecedented scale. From facial recognition systems to the tracking of online activity, the boundaries of privacy continue to erode, echoing the surveillance state depicted in Orwell’s cautionary tale.

Control, too, manifests in strikingly similar ways in both “1984” and 2024. In Orwell’s world, the Party exerts its authority through censorship, propaganda, and the suppression of dissent, wielding power with an iron fist. In our own time, we witness the subtle but insidious mechanisms of control at play, from the manipulation of information to the stifling of free expression. Social media algorithms shape our perceptions, echo chambers reinforce ideological divides, and dissenting voices are silenced through a combination of online censorship and real-world consequences. The erosion of individual autonomy and the consolidation of power in the hands of the few echo the authoritarian tendencies depicted in Orwell’s dystopia.

Manipulation, perhaps, serves as the most chilling parallel between “1984” and 2024. In Orwell’s novel, the Party twists truth and distorts reality to maintain its grip on power, employing propaganda and doublespeak to control the narrative. In the contemporary landscape, we witness a similar manipulation of information, with misinformation and disinformation spreading like wildfire across digital platforms. From state-sponsored propaganda to viral conspiracy theories, the lines between fact and fiction become increasingly blurred, undermining the very foundations of truth and trust. The proliferation of fake news and the erosion of trust in institutions echo the manipulative tactics employed by the Party in Orwell’s fictional world.

As we confront the striking similarities between “1984” and 2024, it becomes clear that Orwell’s dystopian vision serves as a cautionary tale for our times. The pervasiveness of surveillance, the erosion of privacy, the exertion of control, and the manipulation of truth all serve as stark reminders of the dangers of unchecked power and the fragility of freedom. In an age defined by technological advancement and societal upheaval, the lessons of “1984” compel us to remain vigilant in the defense of our liberties and to resist the encroachment of authoritarianism in all its forms. Only by confronting the realities of our present can we hope to avoid the nightmarish future depicted in Orwell’s timeless masterpiece.

2024: Are We Living in Orwell’s Nightmare?

In the year 2024, the question looms large: are we living in Orwell’s nightmare? As we navigate the complexities of our modern world, the parallels between George Orwell’s dystopian masterpiece, “1984,” and the realities of our time are impossible to ignore. From the pervasiveness of surveillance to the erosion of privacy and the manipulation of truth, elements of Orwell’s vision seem to echo ominously in our daily lives.

Surveillance, a central theme in “1984,” has become ubiquitous in the digital age of 2024. Governments and corporations alike harness the power of technology to monitor our movements, track our online activities, and amass vast troves of personal data. From facial recognition systems to the proliferation of CCTV cameras, the boundaries between public and private spheres blur as our every move is scrutinized. The specter of Big Brother, ever-watchful and omnipresent, looms large, raising profound questions about the balance between security and individual freedom.

Furthermore, the erosion of privacy in 2024 mirrors the oppressive regime depicted in Orwell’s novel. In an era defined by social media and digital connectivity, personal information has become a commodity, bought, sold, and exploited for profit. The concept of privacy, once sacrosanct, is increasingly under threat as individuals willingly surrender their data in exchange for convenience and connectivity. The pervasiveness of surveillance capitalism and the commodification of personal information raise troubling questions about autonomy and agency in the digital age.

Perhaps most chillingly, the manipulation of truth in 2024 mirrors the tactics employed by the Party in “1984.” In an era of information overload and viral misinformation, distinguishing fact from fiction becomes increasingly difficult. From deepfake videos to algorithmic echo chambers, the dissemination of falsehoods and the distortion of reality undermine the very foundations of democracy. The erosion of trust in institutions and the proliferation of alternative facts sow seeds of division and discord, echoing the insidious propaganda machine of Orwell’s dystopia.

In light of these parallels, the question of whether we are living in Orwell’s nightmare becomes all the more pressing. As we grapple with the implications of surveillance, control, and manipulation in our society, the lessons of “1984” serve as a sobering reminder of the dangers of unchecked power and the fragility of freedom. Whether we heed these warnings and strive to uphold the principles of truth, transparency, and individual autonomy will ultimately determine the course of our collective future.

Freedom vs. Oppression: Exploring the Societal Shifts from 1984 to 2024

Exploring the societal shifts from Orwell’s “1984” to the year 2024 unveils a complex tapestry of evolving dynamics between freedom and oppression. In Orwell’s dystopian masterpiece, the oppressive regime of the Party reigns supreme, crushing individual liberties under the weight of surveillance, control, and manipulation. Fast forward to 2024, and while echoes of Orwell’s nightmare persist, new complexities have emerged, reshaping the landscape of freedom and oppression in profound ways.

In “1984,” the Party’s stranglehold on power is absolute, with citizens subjected to relentless surveillance, ideological indoctrination, and brutal repression. Personal freedoms are nonexistent, as even the most intimate thoughts and actions are scrutinized by the ever-watchful eye of Big Brother. However, in 2024, while elements of surveillance and control persist, the mechanisms of oppression have evolved in response to technological advancements and shifting societal norms.

The advent of the digital age has ushered in new challenges to personal privacy and autonomy, as governments and corporations harness the power of data to monitor and influence individuals on an unprecedented scale. Social media platforms, once heralded as tools of liberation and connectivity, have become battlegrounds for the manipulation of truth and the spread of misinformation. Algorithms shape our online experiences, reinforcing existing biases and narrowing our exposure to diverse perspectives. While the methods may differ from those employed by the Party in “1984,” the end result remains the same: a pervasive erosion of individual agency in the face of increasingly sophisticated forms of control.

Moreover, the struggle for freedom in 2024 extends beyond the realm of government surveillance to encompass broader societal forces, including systemic inequality, social injustice, and the erosion of democratic norms. The widening gap between the haves and have-nots, exacerbated by economic inequality and corporate greed, threatens to further entrench systems of oppression and marginalization. Meanwhile, the rise of populist movements and authoritarian leaders across the globe poses a direct challenge to democratic principles, testing the resilience of institutions designed to safeguard individual rights and freedoms.

]]>
775
VR killed the Social Life https://hadamard.com/c/vr-killed-the-social-life/ Sat, 02 Mar 2024 10:12:08 +0000 https://hadamard.com/c/?p=771 Continue reading VR killed the Social Life]]>

The Virtual Reality Revolution: Is It Destroying Social Interaction?

In the landscape of technological advancements, virtual reality (VR) stands as one of the most immersive and transformative innovations of our time. With its ability to transport users to entirely new realms and experiences, VR has undoubtedly revolutionized entertainment, education, and various industries. However, as VR continues to permeate into our daily lives, a pertinent question arises: is it enhancing or eroding social interaction?

At first glance, VR appears to offer unparalleled opportunities for social connection. Virtual worlds provide platforms for individuals to interact with friends, family, and even strangers in ways previously unimaginable. Whether it’s exploring exotic locales together, collaborating on projects in virtual workspaces, or simply engaging in virtual hangouts, VR fosters a sense of presence and shared experiences regardless of physical distance.

Yet, beneath the surface lies a nuanced reality. While VR facilitates connections in the digital realm, some argue that it may come at the expense of genuine, face-to-face interaction. The immersive nature of VR can blur the lines between virtual and real-life experiences, leading to a detachment from physical surroundings and interpersonal relationships. As users become engrossed in virtual worlds, they may inadvertently neglect the importance of traditional social interactions, such as spending time with loved ones or participating in community activities.

Moreover, concerns have been raised regarding the potential negative impact of prolonged VR use on mental health and well-being. Excessive immersion in virtual environments may contribute to feelings of isolation, disconnection from reality, and even addiction. As individuals retreat further into the virtual realm, they risk sacrificing meaningful social connections and experiences in favor of simulated alternatives.

Nevertheless, it is essential to recognize that the impact of VR on social interaction is not inherently detrimental. Like any tool, its effects depend on how it is utilized and integrated into society. When used mindfully, VR has the potential to complement rather than replace traditional forms of social interaction. It can serve as a catalyst for new forms of collaboration, creativity, and communication, enriching our social lives in ways previously unattainable.

To navigate the evolving landscape of VR and social interaction, it is crucial to foster a balanced approach that harnesses the benefits of technology while preserving the essence of human connection. This entails promoting digital literacy, encouraging responsible usage, and prioritizing genuine interpersonal relationships over virtual substitutes. By doing so, we can ensure that the virtual reality revolution remains a force for positive change, enhancing rather than detracting from the fabric of our social lives.

Disconnected in a Connected World: How VR Is Changing the Fabric of Social Life

In an era where digital connectivity reigns supreme, virtual reality (VR) emerges as a double-edged sword, reshaping the very essence of social interaction. While touted as a gateway to unparalleled experiences and connections, VR has also introduced a paradoxical dynamic, where individuals find themselves increasingly disconnected in an ostensibly hyper-connected world.

At its core, VR offers an enticing promise: the ability to transcend physical boundaries and immerse oneself in virtual realms teeming with possibilities. From multiplayer gaming to virtual gatherings and remote collaborations, VR facilitates interactions that defy geographical constraints, allowing individuals to forge connections regardless of distance. Yet, as the allure of virtual escapism grows, the line between the digital and physical worlds blurs, giving rise to a disconcerting trend of disconnection.

The immersive nature of VR, while captivating, comes at a cost. As users don headsets and enter virtual environments, they often retreat into solitary experiences, isolated from the tangible presence of others. While virtual interactions may simulate proximity, they often lack the depth and richness of face-to-face encounters, leading to a sense of detachment and alienation from real-world social dynamics.

Moreover, the pervasive nature of VR can exacerbate existing societal divides, widening the gap between those who have access to cutting-edge technology and those who do not. As VR becomes increasingly integrated into various facets of life, from education and entertainment to remote work and socialization, those unable to partake in this digital revolution risk being left behind, further exacerbating feelings of isolation and exclusion.

Furthermore, the immersive allure of VR has raised concerns about its addictive potential and its impact on mental health. As individuals become increasingly engrossed in virtual worlds, they may prioritize virtual interactions over meaningful face-to-face connections, leading to feelings of loneliness, anxiety, and depression. The quest for virtual escapism, while momentarily fulfilling, can ultimately leave individuals feeling more disconnected and disillusioned than ever before.

Yet, amidst these challenges lies an opportunity for reflection and recalibration. Rather than succumbing to the allure of virtual escapism, we must strive to strike a balance between the digital and physical realms, leveraging VR as a tool to enhance, rather than replace, genuine social connections. This entails fostering digital literacy, promoting responsible usage, and prioritizing quality over quantity in our interactions.

Ultimately, the true impact of VR on the fabric of social life lies not in its technological prowess but in how we choose to wield it. By embracing a mindful approach to virtual reality, we can harness its transformative potential to bridge divides, foster empathy, and cultivate authentic connections in an increasingly interconnected yet paradoxically disconnected world.

Virtual Realities, Real Consequences: The Social Ramifications of VR Adoption

The rapid adoption of virtual reality (VR) technology promises to revolutionize how we interact, communicate, and experience the world around us. However, as VR becomes increasingly integrated into our daily lives, it brings with it a host of social ramifications that demand careful consideration.

On the surface, VR appears to offer boundless opportunities for social connection and collaboration. Virtual environments allow users to engage in shared experiences, regardless of physical distance, fostering a sense of presence and interconnectedness. From virtual meetings and collaborative workspaces to immersive social gatherings and multiplayer gaming, VR has the potential to transcend geographical barriers and facilitate meaningful interactions on a global scale.

Yet, beneath this veneer of connectivity lie deeper societal implications that merit scrutiny. As individuals immerse themselves in virtual worlds, they risk blurring the boundaries between the digital and physical realms, leading to a phenomenon known as “virtual isolation.” While virtual interactions may provide a semblance of social connection, they often lack the nuances and richness of face-to-face communication, potentially eroding the quality of interpersonal relationships.

Moreover, the widespread adoption of VR has the potential to exacerbate existing disparities in access and opportunity. As VR technology remains predominantly accessible to those with the means to afford it, there is a risk of widening the digital divide and perpetuating social inequalities. Those who lack access to VR may find themselves excluded from virtual communities and experiences, further marginalizing already disadvantaged groups.

Furthermore, the immersive nature of VR raises concerns about its impact on mental health and well-being. Prolonged exposure to virtual environments may contribute to feelings of disorientation, dissociation, and even addiction. As individuals spend increasing amounts of time in virtual spaces, they may neglect real-world responsibilities and relationships, leading to feelings of loneliness and isolation.

Despite these challenges, it is essential to approach the adoption of VR technology with a critical yet optimistic mindset. While acknowledging the potential pitfalls, we must also recognize the transformative potential of VR to enhance social interaction and collaboration in novel ways. By prioritizing inclusivity, digital literacy, and responsible usage, we can harness the power of VR to bridge divides, foster empathy, and cultivate meaningful connections in an increasingly virtual world.

In essence, the social ramifications of VR adoption are multifaceted and complex, requiring thoughtful consideration and proactive measures to navigate effectively. As we embrace the potential of virtual realities, we must remain vigilant to ensure that the benefits of VR technology are realized without sacrificing the fundamental elements of human connection and social cohesion.

The Loneliness of the Virtual Realm: Examining VR’s Toll on Social Connectivity

In the quest for immersive experiences and digital escapism, virtual reality (VR) has emerged as a captivating frontier. Yet, as users don headsets and venture into virtual realms, a paradoxical reality unfolds: the loneliness of the virtual realm, and the toll it exacts on social connectivity.

At first glance, VR appears to offer a gateway to unparalleled social interaction. Virtual environments promise the opportunity to connect with others in ways that transcend physical boundaries, fostering a sense of presence and shared experiences. From virtual meetups and collaborative projects to multiplayer gaming and social gatherings, VR seemingly offers an antidote to the isolation of modern life.

However, beneath the surface lies a more complex narrative. As individuals immerse themselves in virtual worlds, they often find themselves navigating a landscape marked by virtual isolation. Despite the illusion of connectivity, virtual interactions frequently lack the depth, intimacy, and authenticity of face-to-face communication. As a result, users may experience a profound sense of loneliness, disconnected from the tangible presence of others and the nuances of real-world social dynamics.

Moreover, the immersive nature of VR can exacerbate feelings of isolation and detachment from reality. Prolonged exposure to virtual environments may lead to a blurring of the boundaries between the digital and physical realms, leaving individuals feeling disoriented and disconnected from their surroundings. In the pursuit of virtual escapism, users risk sacrificing meaningful real-world relationships and experiences, further amplifying feelings of loneliness and alienation.

Furthermore, the pervasive nature of VR can perpetuate a cycle of dependency and withdrawal from social interaction. As individuals retreat further into the virtual realm, they may neglect opportunities for genuine connection and engagement in favor of simulated alternatives. This withdrawal from real-world social interactions can have far-reaching consequences for mental health and well-being, contributing to feelings of depression, anxiety, and social isolation.

In confronting the loneliness of the virtual realm, it is imperative to adopt a balanced and mindful approach to VR usage. While virtual reality offers the potential for novel forms of social interaction and expression, it must be complemented by a commitment to nurturing genuine human connections. This entails prioritizing quality over quantity in our interactions, fostering empathy and understanding, and recognizing the importance of real-world relationships in combating loneliness and fostering a sense of belonging.

Ultimately, the loneliness of the virtual realm serves as a sobering reminder of the complexities inherent in our relationship with technology. As we navigate the ever-expanding frontier of virtual reality, we must remain vigilant in safeguarding the bonds that connect us, ensuring that the allure of the digital realm does not come at the expense of our fundamental need for genuine human connection and social connectivity.

]]>
771
Fashionista’s Guide to Surviving Armageddon in 6 steps! https://hadamard.com/c/a-fashionistas-guide-to-surviving-armageddon/ https://hadamard.com/c/a-fashionistas-guide-to-surviving-armageddon/#respond Sat, 24 Feb 2024 13:27:00 +0000 https://hadamard.com/c/?p=352 Continue reading Fashionista’s Guide to Surviving Armageddon in 6 steps!]]> The begin of Armageddon

Ah, the end of the world as we know it, Armageddon, what a perfect excuse to revamp your wardrobe! Forget the doom and gloom; it’s time to embrace Nuclear Fallout Chic and strut your stuff in the post-apocalyptic runway of survival. Who says you can’t look fabulous while dodging radiation?

Picture yourself strolling through the desolate streets, each step a testament to your fearless spirit and impeccable taste. With a keen eye for style and a dash of creativity, you can turn even the bleakest of landscapes into your own personal runway.

Start by embracing utilitarian chic with rugged cargo pants, distressed denim jackets, and combat boots that exude both style and functionality. Layer up with oversized sweaters and chunky scarves for added warmth, while still maintaining that effortlessly cool aesthetic.

Radiation-Resistant Runway Ready Wear:

The first rule of surviving nuclear fallout is to look good doing it. Ditch those hazmat suits; they’re so last apocalypse. Instead, opt for radiation-resistant fashion that’ll have you turning heads as you outrun the fallout. Think metallic fabrics, shimmering in the radiation glow – who says you can’t be fashionable and functional?

Gone are the days of drab, utilitarian attire – it’s time to embrace the beauty of metallic fabrics that catch the light and gleam like beacons of hope in the darkness. From shimmering silver jumpsuits to futuristic chrome jackets, there’s no limit to the stunning looks you can rock in the aftermath of nuclear devastation.

But radiation-resistant fashion isn’t just about looking good – it’s about protecting yourself from the dangers of the fallout. Opt for clothing and accessories made from advanced materials that shield you from harmful radiation while still allowing you to move with ease and grace.

And who says you can’t add a touch of glamour to your survival gear? Adorn yourself with sparkling jewelry and accessories that elevate your look and make a statement in the midst of chaos. From metallic belts to shimmering scarves, every detail counts when it comes to surviving in style.

Accessorize Like It’s the End of the World:

Gas masks are so basic. Go for avant-garde breathing apparatuses that not only filter out toxins but also make a statement.

But why stop there? Take your survival gear to the next level by bedazzling your Geiger counter – because if you’re going down, you might as well go down in style. Transform your radiation detection device into a dazzling accessory that not only keeps you informed of potential hazards but also adds a touch of glamour to your ensemble.

Whether you opt for bold colors, geometric patterns, or sparkling embellishments, your bedazzled Geiger counter is sure to turn heads and spark conversation wherever you go. After all, who says survival gear can’t be fashionable?

So ditch the basic gas masks and embrace the avant-garde world of post-apocalyptic fashion. With breathing apparatuses that make a statement and bedazzled Geiger counters that add a touch of flair, you’ll be ready to navigate the fallout with style and grace. After all, in a world where survival is paramount, why not do it in style?

DIY Fallout Shelter Makeover:

Your fallout shelter doesn’t have to be drab and dreary. Turn it into a Pinterest-worthy refuge! Throw in some fairy lights, plush rugs, and an accent wall with post-apocalyptic graffiti. Who knew a nuclear bunker could be so chic? You’ll be the envy of your irradiated neighbors.

But why stop there? Elevate your shelter with stylish storage solutions, sleek furniture pieces, and decorative accents that add personality to your space. From vintage-inspired decor to modern minimalist touches, the possibilities are endless when it comes to turning your fallout shelter into a chic sanctuary.

With a little creativity and ingenuity, you can create a space that not only provides safety and security but also serves as a haven of comfort and style in the midst of chaos. Who knew a nuclear bunker could be so chic?

So go ahead, unleash your inner interior designer and transform your fallout shelter into the envy of your irradiated neighbors. With fairy lights, plush rugs, and post-apocalyptic graffiti, you’ll be living in style while the world above descends into chaos. After all, why sacrifice style when you can survive in luxury?

Canned Goods Can be Couture:

Survival doesn’t mean sacrificing your culinary tastes. Those canned goods you hoarded? Turn them into a culinary fashion show! Create avant-garde can sculptures or host a canned food cooking competition. Who said the end of the world can’t be deliciously stylish?

With each dish and sculpture serving as a testament to your creativity and resourcefulness, you’ll prove that the end of the world can be deliciously stylish. Who knew canned goods could be so chic?

So go ahead, unleash your inner chef and artist, and turn your canned goods into a culinary fashion show that’s sure to impress even the most discerning of palates. After all, in a world where survival is paramount, why not indulge in a little culinary creativity along the way?

Mutant Pet Makeovers:

Mutated animals roaming the wasteland? Embrace the chance to turn them into your fashionable sidekicks! Bedazzle your irradiated iguana or fashion a stylish scarf for your three-eyed cat. The new world is your runway, and mutant pets are the must-have accessories.

With your mutant pet as your stylish companion, you’ll be ready to conquer the new world with confidence and flair. After all, in a world where survival is paramount, why not surround yourself with fashionable companions who make the journey a little more glamorous?

Post-Apocalyptic Dance Parties:

What better way to survive the nuclear fallout than with a dance party? Crank up the radioactivity and get your groove on in the safety of your fallout shelter. Who cares if the world is ending? You’ll be twerking your way through Armageddon like it’s your last day on Earth – because, well, it might be.

So go ahead, let loose and dance like there’s no tomorrow – because in a world where every day is a gamble, why not make the most of the time you have? With a dance party to remember, you’ll be turning Armageddon into the ultimate celebration of life, love, and the power of human resilience.

Conclusion:

Surviving the nuclear fallout doesn’t mean surrendering your sense of style. Embrace the chaos, redefine fashion, and make a statement that’ll outshine the radiation. Who knows, maybe your fallout chic ensemble will be the inspiration for the next generation of survivors. After all, when life gives you nuclear fallout, make it a fashion show!

]]>
https://hadamard.com/c/a-fashionistas-guide-to-surviving-armageddon/feed/ 0 352
Quantum repeaters, definitely the Holy Grail?! https://hadamard.com/c/quantum-repeaters-definitely-the-holy-grail/ Wed, 21 Feb 2024 16:08:26 +0000 https://hadamard.com/c/?p=721 Continue reading Quantum repeaters, definitely the Holy Grail?!]]> Quantum repeaters hold the promise of revolutionizing secure data transmission by leveraging the principles of quantum mechanics. Unlike classical communication, which relies on the transmission of classical bits (0s and 1s), quantum communication utilizes quantum bits or qubits, which can exist in multiple states simultaneously due to superposition and entanglement. However, despite its potential, quantum communication faces several challenges, particularly concerning the degradation of quantum signals over long distances. In this article, we explore the necessity for quantum repeaters to overcome these challenges and facilitate the widespread adoption of quantum communication.

Quantum repeaters
  1. Quantum Communication and its Challenges: Quantum communication relies on the transmission of quantum states over long distances. However, quantum signals are susceptible to decoherence and attenuation as they travel through optical fibers, limiting the achievable transmission distances. Additionally, quantum signals are vulnerable to noise and eavesdropping, necessitating robust methods for secure communication.
  2. The Need for Quantum Repeaters: Quantum repeaters are essential components for extending the range of quantum communication networks. By employing quantum entanglement swapping and purification techniques, quantum repeaters can effectively regenerate entanglement across successive segments of a link, mitigating the effects of signal loss and decoherence. Moreover, quantum repeaters enable the establishment of secure quantum channels over intercontinental distances, facilitating secure communication protocols such as quantum key distribution (QKD). (A back-of-the-envelope illustration of why segmenting a link helps follows this list.)
  3. Challenges in Quantum Repeater Implementation: Despite their potential benefits, the practical implementation of quantum repeaters poses several challenges. Engineering reliable quantum repeater nodes capable of maintaining long-lived quantum states and performing entanglement operations with high fidelity remains a significant technical hurdle. Furthermore, the integration of quantum repeaters into existing communication infrastructures requires careful consideration of compatibility and scalability.
  4. Strategies for Overcoming Challenges: To address the challenges associated with quantum repeaters, ongoing research focuses on developing novel quantum repeater protocols and technologies. Advancements in quantum error correction codes, quantum memory devices, and quantum network architectures are crucial for realizing practical quantum repeater systems. Additionally, interdisciplinary collaborations between quantum physicists, engineers, and computer scientists are essential for driving innovation in quantum communication technologies.
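
As a back-of-the-envelope illustration of point 2, the sketch below compares direct photon transmission through optical fibre with an idealized repeater chain. The 0.2 dB/km attenuation figure is a typical value for telecom fibre; the repeater model is deliberately crude (perfect memories, lossless swapping) and is only meant to show why splitting a long link into shorter segments helps at all.

```python
ATTENUATION_DB_PER_KM = 0.2        # typical loss in telecom fibre

def transmission_probability(distance_km):
    """Probability that a single photon survives a fibre of this length."""
    loss_db = ATTENUATION_DB_PER_KM * distance_km
    return 10 ** (-loss_db / 10)

def direct_rate(distance_km, attempts_per_s=1e9):
    """Entanglement-distribution rate with no repeaters (idealized)."""
    return attempts_per_s * transmission_probability(distance_km)

def repeater_rate(distance_km, n_segments, attempts_per_s=1e9):
    """Crude upper bound with ideal repeaters: the end-to-end rate is set by
    the success probability of a single (much shorter) segment."""
    return attempts_per_s * transmission_probability(distance_km / n_segments)

for d in (100, 500, 1000):
    print(f"{d:5d} km | direct: {direct_rate(d):10.3e} /s | "
          f"with 8 segments: {repeater_rate(d, 8):10.3e} /s")
```

At 1,000 km the direct rate collapses to about 10^-11 pairs per second, while even this crude eight-segment model stays in the megahertz range; that gap is what quantum repeaters are meant to close.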

Conclusion: Quantum repeaters play a pivotal role in overcoming the challenges of quantum communication, enabling the realization of secure and efficient quantum networks. While significant progress has been made in the development of quantum repeater technologies, further research and development are needed to fully harness the potential of quantum communication for applications ranging from secure data transmission to quantum internet infrastructure.

Unraveling the Quantum Tango: Entanglement in Quantum Communication

Figure: real-space and reciprocal-space images of an MQWL laser array

In the ethereal realm of quantum mechanics, entanglement emerges as a mesmerizing phenomenon, challenging our classical intuitions and revolutionizing the landscape of communication. At its core, entanglement embodies a dance of particles, where the properties of one become inherently linked with another, irrespective of the distance separating them. This entwined state, described famously by Einstein as “spooky action at a distance,” forms the bedrock of quantum communication, promising unparalleled security and efficiency.

Entanglement springs forth from the quantum superposition principle, where particles exist in multiple states simultaneously until observed, akin to Schrödinger’s enigmatic cat both alive and dead. When two or more particles become entangled, their quantum states become intertwined, so that the measurement of one instantaneously influences the state of the other, regardless of the spatial expanse between them. This intricate connection persists even when the entangled particles are light-years apart, defying classical notions of locality.

The significance of entanglement in quantum communication is profound. Quantum entanglement serves as the cornerstone of quantum cryptography, enabling secure communication channels resistant to eavesdropping attempts. By encoding information onto entangled particles and transmitting them to remote locations, quantum communication promises unbreakable encryption, safeguarding sensitive data in an era threatened by quantum computing’s potential to unravel classical encryption schemes.

However, amidst the promise lie formidable challenges in maintaining entanglement over long distances. Quantum decoherence, the relentless interaction of entangled particles with their surrounding environment, poses a formidable obstacle. As entangled particles traverse vast distances through mediums rife with noise and interference, their delicate quantum states degrade, unraveling the entanglement and undermining the fidelity of communication channels.

Efforts to mitigate decoherence and preserve entanglement span a spectrum of innovative techniques. Quantum error correction codes offer a robust defense mechanism against environmental perturbations, encoding redundancy into quantum information to detect and rectify errors. Quantum repeaters emerge as heralds of long-distance quantum communication, orchestrating the faithful transmission of entanglement over extended distances by segmenting the communication channel into manageable segments and purifying entangled states along the way.
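
For readers who like to see the mechanics, here is a minimal entanglement-swapping sketch. It uses Qiskit purely as an illustration (the article does not prescribe any library): two Bell pairs are created, a Bell-basis measurement is performed on the two middle qubits, and the Pauli corrections are applied coherently via the deferred-measurement trick, leaving the two outer qubits entangled even though they never interacted directly.

```python
# A minimal entanglement-swapping sketch (Qiskit chosen only for illustration).
# Two Bell pairs (q0,q1) and (q2,q3) are prepared; a Bell-basis operation on the
# middle qubits plus coherent Pauli corrections leaves q0 and q3 entangled.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, partial_trace

qc = QuantumCircuit(4)
qc.h(0); qc.cx(0, 1)          # Bell pair between q0 and q1 (link A)
qc.h(2); qc.cx(2, 3)          # Bell pair between q2 and q3 (link B)
qc.cx(1, 2); qc.h(1)          # Bell-basis rotation on the middle qubits
qc.cx(2, 3); qc.cz(1, 3)      # deferred-measurement Pauli corrections on q3

state = Statevector.from_instruction(qc)
rho_03 = partial_trace(state, [1, 2])   # reduced state of the outer qubits
print(rho_03)                           # ~|Phi+><Phi+| on (q0, q3)
```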

Moreover, the burgeoning field of quantum memory heralds breakthroughs in storing and retrieving quantum information, paving the way for sustained entanglement over vast distances. From solid-state systems to trapped ions and optical cavities, diverse platforms vie for supremacy in the quest for resilient quantum memories capable of preserving entanglement’s delicate dance.

As we navigate the labyrinthine realm of quantum communication, entanglement stands as both enigma and savior, promising unparalleled security while confronting the formidable challenge of maintaining its delicate embrace over cosmic expanses. In the interplay of theory and experiment, scientists unravel the mysteries of entanglement, forging new frontiers in quantum communication and heralding a future where the dance of entangled particles shapes the fabric of secure communication.

Quantum Communication: Navigating the Cosmos with Entangled Qubits

Bell Test

In the quest for secure and efficient communication across cosmic expanses, quantum mechanics unveils a mesmerizing array of phenomena harnessed to encode, transmit, and protect information. At the heart of this endeavor lie three key pillars: quantum memories, entanglement swapping, and error correction techniques in quantum repeaters. Together, they constitute the backbone of long-distance quantum communication, enabling the faithful transmission of quantum information amidst the relentless onslaught of noise and decoherence.

Quantum memories stand as the custodians of quantum information, poised to capture and retain the delicate quantum states essential for communication. These remarkable devices harness the principles of quantum superposition and entanglement to store qubits—the fundamental units of quantum information—within their confines. From solid-state systems to atomic ensembles and optical cavities, diverse platforms vie for supremacy in the quest for resilient quantum memories capable of enduring the ravages of decoherence while faithfully preserving quantum information.

Current state-of-the-art quantum repeater implementations:

The most advanced quantum repeater demonstrations to date draw on a handful of experimental platforms. Recent implementations use trapped ions, neutral atoms, or solid-state systems as qubits, and several have distributed entanglement over significant distances, alongside demonstrations of entanglement swapping and, in some cases, elements of quantum error correction within the repeater nodes.

Innovations driving progress in quantum repeater technology:

Recent breakthroughs continue to push the boundaries of quantum repeater capabilities. These include new techniques for extending qubit coherence times, improving entanglement generation rates, and enhancing the fidelity of quantum operations. Innovations in materials science, quantum control techniques, and hardware integration also play a role, as do novel approaches to repeater architecture such as modular designs and hybrid systems that combine different qubit platforms.

Challenges and ongoing research efforts:

Significant obstacles and open research questions remain. Chief among them are scalability, efficiency, and reliability: scaling entanglement distribution up to larger networks is difficult, and noise and decoherence still limit repeater performance. Ongoing efforts to overcome these challenges span theoretical studies, experimental investigations, and interdisciplinary collaborations. Promising future directions include integrating repeater networks with other quantum technologies and leveraging quantum machine learning to optimize repeater operations.

Taken together, these developments give a picture of the current state, recent advances, and future prospects of quantum repeater technology, highlighting both the achievements and the remaining challenges in this exciting field.

In the vast expanse of the quantum realm, where particles dance in states of uncertainty and entanglement weaves a tapestry of interconnectedness, there exists a remarkable construct known as the Quantum Bridge. Picture, if you will, a shimmering pathway stretching across the cosmos, linking distant points in space with threads of quantum entanglement, guided by the intricate mechanisms of quantum repeaters.

Imagine yourself as a traveler, navigating this ethereal bridge that transcends the limitations of classical communication. As you step onto its ephemeral surface, you feel the faint hum of quantum entanglement resonating through your being, a sensation both exhilarating and humbling. Each step forward is a leap into the unknown, guided by the enigmatic forces of quantum mechanics.

In this journey, the Quantum Bridge reveals itself as more than just a conduit for information—it is a portal to a realm where the ordinary rules of physics no longer apply. Here, the outcomes of measurements on entangled particles remain correlated across vast distances, their fates intertwined no matter how far apart they drift. Through the lens of storytelling, we embark on a voyage of discovery, peering into the heart of quantum repeaters and unraveling the mysteries of entanglement.

As we traverse the Quantum Bridge, we encounter nodes of quantum repeater stations, each a beacon of light in the darkness of space. These stations, like lighthouses in a stormy sea, serve as waypoints for entangled photons, amplifying and distributing their quantum states with precision and care. Through the art of storytelling, we delve into the inner workings of these stations, where quantum memories store the fleeting echoes of entanglement, and entanglement swapping mechanisms weave intricate patterns of connectivity.

But beyond the technical marvels lies a deeper truth—a testament to the profound interconnectedness of the universe. Through the narrative lens, we glimpse the beauty of entanglement, where particles separated by vast distances remain inextricably linked, their destinies entwined in a cosmic dance. It is a story of unity in diversity, where the boundaries between here and there blur into insignificance, and the notion of separateness fades into obscurity.

As we reach the end of our journey, we emerge from the Quantum Bridge with a newfound appreciation for the wonders of quantum mechanics. Through the power of storytelling, we have transcended the complexities of the quantum realm, embarking on a voyage of imagination and discovery. And though the Quantum Bridge may remain shrouded in mystery, its echoes reverberate through the fabric of space and time, a testament to the boundless potential of human creativity and scientific inquiry.

Predictions for the Evolution of Quantum Repeater Technology: Quantum repeaters serve as the cornerstone for long-distance quantum communication by effectively extending the range of quantum entanglement beyond the limitations imposed by decoherence. Looking ahead, we anticipate significant strides in enhancing the efficiency, scalability, and fidelity of quantum repeater systems.

]]>
721
Become the best Quantum Programmer https://hadamard.com/c/become-a-quantum-programmer/ https://hadamard.com/c/become-a-quantum-programmer/#respond Sat, 17 Feb 2024 09:10:20 +0000 https://hadamard.com/c/?p=628 Continue reading Become the best Quantum Programmer]]> Become the best Quantum Programmer

In the vast expanse of the technological landscape, one realm stands out as both enigmatic and promising – Quantum Computing. It is the embodiment of scientific marvels, promising a revolution in computation that transcends the limitations of classical computing. As we delve into the intricacies of this frontier, one cannot overlook the pivotal role of Quantum Programming, the language that unlocks the potential of these quantum machines.

Quantum Computing operates on the fundamental principles of quantum mechanics, a branch of physics that describes the behavior of matter and energy at the smallest scales. At the core of quantum computing lies the quantum bit or qubit, the quantum analog of the classical binary bit. Unlike classical bits, which can only exist in states of 0 or 1, qubits can exist in a superposition of both states simultaneously, owing to the principles of quantum superposition. This property exponentially increases the computational power of quantum computers, enabling them to perform complex calculations at unparalleled speeds.

Quantum Programming emerges as the bridge between human intellect and quantum machines, enabling us to harness the raw potential of quantum computing. Unlike conventional programming paradigms, Quantum Programming operates in the domain of quantum mechanics, necessitating a paradigm shift in the way algorithms are formulated and executed.

One of the foundational concepts in Quantum Programming is quantum entanglement, a phenomenon where qubits become correlated to each other in such a way that the state of one qubit instantaneously influences the state of another, regardless of the distance between them. This property forms the basis of quantum algorithms, allowing for the development of novel approaches to problem-solving, such as Shor’s algorithm for integer factorization and Grover’s algorithm for database search, which showcase the inherent superiority of quantum computation in certain tasks.

Another key concept in Quantum Programming is quantum parallelism, which exploits the ability of qubits to exist in multiple states simultaneously. This parallelism enables quantum algorithms to explore multiple solutions to a problem simultaneously, vastly outperforming classical algorithms in certain computational tasks.

However, Quantum Programming also poses unique challenges. Quantum systems are highly sensitive to noise and errors, leading to the phenomenon of decoherence, where quantum states degrade over time due to interactions with the environment. Quantum programmers must develop techniques to mitigate these errors through error correction codes and fault-tolerant algorithms, ensuring the reliability and robustness of quantum computations.

As we stand on the brink of a new era in computing, Quantum Programming represents not only a technological advancement but a paradigm shift in our understanding of computation itself. It challenges us to rethink the very fabric of reality and explore the mysteries of the quantum realm. With each line of code, we embark on a journey into the unknown, pushing the boundaries of human knowledge and endeavoring to unlock the full potential of the quantum leap in computing.

Understanding Quantum Computing

In the labyrinthine world of quantum mechanics lies the key to a computational revolution – Quantum Computing. It’s a realm where the ordinary rules of classical physics blur, giving rise to mind-bending phenomena and unparalleled computational power. To comprehend this frontier, we must embark on a journey through the mysteries of quantum mechanics, unraveling the enigmatic nature of quantum bits (qubits), and deciphering the language of Quantum Programming through quantum gates and circuits.

Quantum mechanics, the cornerstone of Quantum Computing, defies the intuitions of classical physics. At its heart lies the concept of superposition, where quantum entities like particles or qubits can exist in multiple states simultaneously. This property forms the bedrock of quantum computation, allowing qubits to encode and process vast amounts of information in parallel, leading to exponential computational speedups over classical systems.

Quantum Bits, or qubits, serve as the building blocks of quantum computation. Unlike classical bits, which are bound to either a 0 or 1 state, qubits can exist in a superposition of both states simultaneously. This superposition endows qubits with immense computational power, enabling quantum computers to explore a multitude of solutions to a problem simultaneously, revolutionizing the landscape of computation.

However, the true magic of Quantum Computing unfolds through the manipulation of qubits using Quantum Gates and Circuits, the language of Quantum Programming. Quantum gates are the quantum analogs of classical logic gates, operating on qubits to perform specific quantum operations. These operations include rotations, flips, and entanglement, each playing a crucial role in quantum algorithms and computations.

Quantum circuits, composed of interconnected quantum gates, orchestrate the intricate dance of qubits, guiding them through a choreographed sequence of operations to execute quantum algorithms. These circuits encode the essence of Quantum Programming, translating abstract mathematical concepts into tangible quantum operations, and unlocking the computational potential of quantum computers.
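
As a concrete, if tiny, illustration of gates composing into a circuit, the sketch below applies a rotation, a flip, and an entangling CNOT, then prints the circuit and the state it prepares. Qiskit is used here as one possible choice, not as the only way to write such a circuit.

```python
# A small illustration (using Qiskit, one of several possible libraries) of how
# individual gates, a rotation, a flip, and an entangling operation, compose
# into a circuit, and what state that circuit prepares.
from math import pi
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.ry(pi / 2, 0)   # rotation: takes |0> to (|0> + |1>)/sqrt(2)
qc.x(1)            # flip: takes |0> to |1>
qc.cx(0, 1)        # entanglement: CNOT correlates the two qubits

print(qc.draw())                          # ASCII sketch of the circuit
print(Statevector.from_instruction(qc))   # (|01> + |10>)/sqrt(2), a Bell state
```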

Quantum Programming is not without its challenges.

Quantum systems are inherently fragile, susceptible to noise, decoherence, and errors that can disrupt quantum computations. Quantum programmers must navigate this treacherous landscape, devising error-correction techniques, fault-tolerant algorithms, and quantum error correction codes to ensure the reliability and accuracy of quantum computations.

In the quest to understand Quantum Computing, we embark on a journey through the quantum realm, where the laws of classical physics dissolve, and the mysteries of the quantum universe unfold. Through the language of Quantum Programming, we harness the power of quantum mechanics, unlocking the potential of quantum bits and circuits to redefine the boundaries of computation and propel humanity into a new era of technological innovation.

Bridging Classical and Quantum Worlds

In the ever-evolving landscape of programming, a new frontier emerges – Quantum Programming. It stands as a testament to the convergence of classical and quantum worlds, challenging traditional paradigms and offering new perspectives on problem-solving through quantum algorithms and simulation. To grasp the essence of Quantum Programming, we delve into the foundational concepts that underpin this revolutionary field.

Classical vs. Quantum Programming Paradigms:

Classical programming operates within the confines of deterministic logic, where algorithms progress sequentially, executing instructions one after another. In contrast, Quantum Programming transcends classical boundaries, harnessing the principles of quantum mechanics to manipulate qubits and explore multiple computational paths simultaneously. This paradigm shift introduces novel concepts such as superposition, entanglement, and quantum parallelism, redefining the very nature of computation and opening doors to unprecedented computational power.

Quantum Programmer

Quantum Algorithms: Shifting Perspectives on Problem Solving

At the heart of Quantum Programming lies the art of crafting quantum algorithms – algorithms tailored to exploit the unique properties of quantum systems. These algorithms challenge traditional notions of problem-solving, offering exponential speedups for certain computational tasks. Shor’s algorithm, for example, revolutionizes integer factorization, threatening classical encryption methods with its ability to efficiently factor large numbers. Grover’s algorithm, on the other hand, transforms database search, showcasing the inherent advantage of quantum parallelism in exploring vast solution spaces. As quantum algorithms continue to evolve, they reshape our understanding of computational complexity, offering new avenues for tackling previously intractable problems.
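
To make this concrete, here is a minimal two-qubit Grover search written in Qiskit (a library choice made purely for illustration). The oracle marks the state |11> with a phase flip, and a single Grover iteration amplifies it to certainty.

```python
# A two-qubit Grover search sketch: one iteration suffices to find the
# marked item |11> with probability ~1 in a search space of four items.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h([0, 1])                       # uniform superposition over 00, 01, 10, 11
qc.cz(0, 1)                        # oracle: phase-flip the marked state |11>
qc.h([0, 1]); qc.x([0, 1])         # diffusion operator:
qc.cz(0, 1)                        #   reflect the amplitudes about their mean
qc.x([0, 1]); qc.h([0, 1])

print(Statevector.from_instruction(qc).probabilities_dict())
# expect '11' with probability ~1 after a single iteration
```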

Quantum Simulation: Replicating Nature’s Complexity

Beyond problem-solving, Quantum Programming unlocks the power of quantum simulation, allowing us to replicate the intricacies of natural phenomena with unprecedented fidelity. Quantum simulators emulate complex quantum systems, providing insights into quantum materials, chemical reactions, and biological processes that defy classical simulation methods. From modeling the behavior of electrons in superconductors to predicting the behavior of molecules in drug design, quantum simulation promises to revolutionize fields ranging from materials science to pharmaceuticals, offering a glimpse into the inner workings of the quantum universe.

As we navigate the foundations of Quantum Programming, we embark on a journey into uncharted territory, where the rules of classical computation no longer apply, and the mysteries of the quantum realm beckon. Through quantum algorithms and simulation, we push the boundaries of problem-solving, paving the way for a future where quantum computers unravel the complexities of nature and empower humanity to tackle challenges once deemed insurmountable.

Tools and Technologies in Quantum Computing

In the ever-expanding universe of quantum computing, a myriad of tools and technologies have emerged, empowering enthusiasts and researchers alike to delve into the depths of this transformative field. From quantum programming languages to development kits and quantum hardware, each component plays a vital role in facilitating hands-on exploration and innovation in quantum computation.

Quantum Programming Languages: From Qiskit to Cirq

Quantum programming languages serve as the gateway to quantum computation, providing developers with the means to express algorithms and interact with quantum hardware. Among the notable languages are Qiskit, developed by IBM Quantum, and Cirq, pioneered by Google Quantum AI. Qiskit offers a high-level, Python-based interface, making quantum programming accessible to a wide audience and providing comprehensive libraries for quantum algorithm design and execution. On the other hand, Cirq boasts a more low-level approach, offering fine-grained control over quantum circuits and enabling developers to experiment with advanced quantum operations and optimizations. These languages, along with others like Quipper and Q# (Microsoft Quantum Development Kit), democratize quantum programming and foster a vibrant community of quantum enthusiasts and researchers worldwide.
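
The contrast in flavor is easiest to see side by side. The sketch below prepares and samples the same Bell pair in Qiskit and in Cirq; it assumes the qiskit, qiskit-aer, and cirq packages are installed and is meant only as a taste of each API.

```python
# The same two-qubit Bell-pair experiment expressed in Qiskit and in Cirq.
# Sketch only; assumes qiskit, qiskit-aer, and cirq are installed.

# --- Qiskit ---
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qk = QuantumCircuit(2, 2)
qk.h(0)
qk.cx(0, 1)
qk.measure([0, 1], [0, 1])
print(AerSimulator().run(qk, shots=1000).result().get_counts())  # ~{'00': 500, '11': 500}

# --- Cirq ---
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit([cirq.H(q0), cirq.CNOT(q0, q1), cirq.measure(q0, q1, key="m")])
print(cirq.Simulator().run(circuit, repetitions=1000).histogram(key="m"))  # ~{0: 500, 3: 500}
```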

Quantum Development Kits: Enabling Hands-On Exploration

Quantum development kits (QDKs) provide a suite of tools and resources for hands-on exploration and experimentation in quantum computing. These kits typically include simulators for quantum circuit simulation, quantum compilers for optimizing quantum code, and interfaces for interacting with quantum hardware. IBM Quantum Experience and Google Quantum Computing Playground are exemplary QDKs that offer intuitive interfaces for designing and simulating quantum circuits, allowing users to gain insights into quantum phenomena and algorithms without the need for physical quantum hardware. Furthermore, these platforms often provide educational resources, tutorials, and community forums, fostering learning and collaboration among quantum enthusiasts of all skill levels.

Quantum Hardware: Navigating the Landscape of Quantum Processors

At the heart of quantum computing lies quantum hardware – physical devices capable of manipulating and measuring qubits. Quantum processors, such as IBM Quantum’s Q System One and Google’s Sycamore, represent the pinnacle of quantum engineering, boasting impressive qubit counts and coherence times. These devices harness superconducting circuits, trapped ions, and other exotic phenomena to create and control qubits, paving the way for practical quantum computation. However, navigating the landscape of quantum hardware presents unique challenges, including qubit connectivity, error rates, and calibration complexity. As quantum hardware continues to evolve, researchers strive to overcome these challenges and unlock the full potential of quantum computing for real-world applications.

In the realm of quantum computing, tools and technologies serve as the bedrock of exploration and innovation, empowering enthusiasts and researchers to push the boundaries of what’s possible. From quantum programming languages and development kits to quantum hardware, each component plays a crucial role in democratizing quantum computation and propelling humanity towards a future where quantum technologies revolutionize the world.

As quantum computing continues to march forward, it encounters a myriad of challenges on its path to revolutionize the world of computation. From the fundamental issue of quantum noise to the integration of classical and quantum systems, and the exploration of quantum programming beyond traditional computing applications, the future of quantum computing is ripe with both obstacles and opportunities.

Overcoming Quantum Noise: The Quest for Scalable Quantum Computing

One of the foremost challenges in quantum computing is mitigating the disruptive effects of quantum noise. Quantum systems are inherently fragile, susceptible to environmental disturbances that can introduce errors and degrade the fidelity of quantum computations. Researchers are actively pursuing techniques such as error correction codes, fault-tolerant algorithms, and decoherence mitigation strategies to combat quantum noise and pave the way for scalable quantum computing. By enhancing qubit coherence times, reducing error rates, and improving fault tolerance, scientists aim to realize the full potential of quantum computers and unlock their transformative power across various domains.
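
The flavor of these error correction codes can be seen in the textbook three-qubit bit-flip code, sketched below in Qiskit (an illustrative choice, not a production scheme). One logical qubit is spread over three physical qubits, and a single bit-flip error on any of them is undone by a majority vote.

```python
# A toy illustration of error correction: the 3-qubit bit-flip (repetition) code.
# The logical qubit is spread over three physical qubits; a single X error on
# any of them is corrected by a majority vote. Sketch only.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(3)
qc.ry(1.0, 0)            # prepare an arbitrary logical state on qubit 0
qc.cx(0, 1); qc.cx(0, 2) # encode: |psi> -> a|000> + b|111>

qc.x(1)                  # a bit-flip error strikes physical qubit 1

qc.cx(0, 1); qc.cx(0, 2) # decode: qubits 1 and 2 now hold the error syndrome
qc.ccx(1, 2, 0)          # majority vote: flip qubit 0 only if both flags are set

state = Statevector.from_instruction(qc)
print(state.probabilities_dict([0]))   # matches ry(1.0)|0>: {'0': ~0.77, '1': ~0.23}
```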

Bridging the Gap: Integrating Classical and Quantum Systems

While quantum computing holds immense promise, its seamless integration with classical computing systems remains a formidable challenge. Bridging the gap between classical and quantum worlds requires the development of hybrid computing architectures that leverage the strengths of both paradigms. Hybrid quantum-classical algorithms, such as variational quantum algorithms and quantum-classical optimization techniques, serve as stepping stones towards this integration, enabling classical computers to orchestrate and interface with quantum computations effectively. By establishing robust communication protocols, optimizing resource allocation, and harmonizing classical and quantum software stacks, researchers strive to create synergistic computing ecosystems that harness the combined power of classical and quantum systems.
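
The hybrid pattern itself is simple: a quantum routine evaluates a parameterized circuit while a classical optimizer steers the parameters. The sketch below, assuming Qiskit and SciPy purely for illustration, minimizes the expectation value of Z for a one-parameter circuit; full variational algorithms such as VQE and QAOA follow the same loop with richer circuits and observables.

```python
# A minimal hybrid quantum-classical loop: a simulated quantum routine
# evaluates a one-parameter circuit while a classical optimizer tunes it.
import numpy as np
from scipy.optimize import minimize_scalar
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, SparsePauliOp

observable = SparsePauliOp("Z")          # toy "Hamiltonian" on one qubit

def energy(theta: float) -> float:
    """Quantum part: prepare RY(theta)|0> and return <Z> = cos(theta)."""
    qc = QuantumCircuit(1)
    qc.ry(theta, 0)
    return float(Statevector.from_instruction(qc).expectation_value(observable).real)

# Classical part: search for the parameter that minimizes the energy.
result = minimize_scalar(energy, bounds=(0.0, 2 * np.pi), method="bounded")
print(result.x, energy(result.x))        # ~pi and ~-1, i.e. the ground state |1>
```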

Quantum Programming Beyond Computing: Applications in Finance, Healthcare, and More

Beyond the realm of traditional computing, quantum programming holds the potential to revolutionize a myriad of industries, including finance, healthcare, materials science, and beyond. Quantum algorithms for portfolio optimization, risk analysis, and option pricing promise to revolutionize financial markets, offering unprecedented speedups and insights for decision-makers. In healthcare, quantum computing holds the promise of accelerating drug discovery, optimizing treatment plans, and deciphering the complexities of biological systems with unparalleled precision. Moreover, quantum programming techniques such as quantum machine learning and quantum-inspired optimization algorithms are poised to transform industries ranging from logistics and supply chain management to cybersecurity and telecommunications. As quantum programming matures and quantum hardware advances, the horizon of possibilities expands, heralding a future where quantum technologies drive innovation and reshape the fabric of society.

In the face of these challenges and opportunities, the future of quantum computing is both exciting and uncertain. However, with continued research, collaboration, and innovation, we are poised to overcome obstacles, unlock new frontiers, and harness the full potential of quantum computing to tackle some of the most pressing challenges facing humanity.

Become a Quantum Programmer

Becoming a Quantum Programmer is an exciting journey into the cutting-edge realm of quantum computing, where traditional boundaries of computation blur, and new frontiers of exploration unfold. To embark on this journey, one must navigate a path rich with learning resources, hands-on practice, and active participation in the vibrant quantum community.

Learning Resources and Educational Pathways

The first step towards becoming a Quantum Programmer involves immersing oneself in learning resources and educational pathways tailored to quantum computing. Numerous online courses, tutorials, and textbooks offer comprehensive introductions to quantum mechanics, quantum algorithms, and quantum programming languages. Platforms like IBM Quantum Experience, Microsoft Quantum Development Kit, and Google Quantum Computing Playground provide interactive learning environments where aspiring quantum programmers can experiment with quantum circuits, execute quantum algorithms, and gain hands-on experience with real quantum hardware. Additionally, academic institutions and research organizations offer specialized degree programs and workshops in quantum computing, providing structured pathways for formal education and research in the field.

Hands-On Practice: From Quantum Circuits to Quantum Algorithms

Building proficiency as a Quantum Programmer requires hands-on practice with quantum circuits, algorithms, and programming languages. Quantum simulators and development kits offer invaluable tools for exploring quantum phenomena, designing quantum circuits, and implementing quantum algorithms. By experimenting with quantum gates, measuring qubits, and analyzing quantum states, aspiring quantum programmers develop intuition and expertise in quantum mechanics and quantum computation. Moreover, tackling real-world problems through quantum programming challenges and hackathons provides opportunities to apply theoretical knowledge to practical problems, fostering creativity, problem-solving skills, and collaboration within the quantum community.

Joining the Quantum Community: Collaborating and Innovating in the Field

Active participation in the quantum community is essential for aspiring Quantum Programmers to collaborate, share ideas, and innovate in the field. Quantum forums, online communities, and social media platforms serve as hubs for networking, knowledge exchange, and collaboration among quantum enthusiasts, researchers, and industry professionals. Engaging in discussions, attending conferences, and contributing to open-source quantum software projects enable aspiring quantum programmers to stay updated on the latest developments, connect with experts, and make meaningful contributions to the advancement of quantum computing. By joining forces with like-minded individuals and embracing a culture of openness, curiosity, and collaboration, aspiring Quantum Programmers can play a pivotal role in shaping the future of quantum computing and unlocking its transformative potential.

In summary, becoming a Quantum Programmer is a journey filled with learning, exploration, and collaboration. By leveraging learning resources, engaging in hands-on practice, and actively participating in the quantum community, aspiring quantum programmers can embark on a path of continuous growth, discovery, and innovation in the fascinating realm of quantum computing.

]]>
https://hadamard.com/c/become-a-quantum-programmer/feed/ 0 628
Quantum Key Distribution: How Business Owners Can Empower Their Enterprises with the Potential of Quantum Physics https://hadamard.com/c/quantum-key-distribution-how-business-owners-can-empower-their-enterprises-with-the-potential-of-quantum-physics/ Wed, 14 Feb 2024 13:05:05 +0000 https://hadamard.com/c/?p=706 Continue reading Quantum Key Distribution: How Business Owners Can Empower Their Enterprises with the Potential of Quantum Physics]]>

How can you use Quantum Key Distribution for your business?

How Business Owners Can Harness the Power of Quantum Physics is a comprehensive guide to the potential of quantum physics for business growth. It offers practical insights into quantum innovations and their applications in the business world, giving business owners a clear understanding of how to leverage quantum technologies to drive innovation and achieve success.

Quantum Key Distribution System

The Innovations in Quantum Physics

Quantum physics has revolutionized the way we understand the fundamental nature of reality. It has given rise to groundbreaking innovations that have the potential to transform various industries, including business. With its unique principles and concepts, quantum physics offers new possibilities for businesses to enhance their operations and drive innovation. By harnessing the power of quantum physics, business owners can tap into quantum computing, quantum algorithms, and quantum sensing technologies, which can optimize processes, accelerate problem-solving, and provide a competitive edge in the market. Implementing these innovations can lead to improved efficiency, increased productivity, and the ability to tackle complex challenges with greater precision and speed.

Quantum Computing

Quantum Algorithms – Ronald de Wolf

Quantum computing is a paradigm-shifting technology that utilizes the principles of quantum mechanics to perform computations at an unprecedented speed and scale. It leverages quantum bits, or qubits, which can exist in multiple states simultaneously, allowing for exponential processing power. By harnessing the potential of quantum computing, businesses can tackle complex problems and perform calculations that were previously infeasible with classical computers. This technology holds the promise to revolutionize fields such as cryptography, optimization, machine learning, and drug discovery, paving the way for significant advancements in various industries.

Quantum Encryption

Quantum encryption is a cutting-edge security technology that leverages the principles of quantum mechanics to provide unbreakable encryption for data transmission. It utilizes the quantum properties of particles to ensure secure communication, making it virtually impossible for hackers to intercept or decipher the transmitted information. Quantum encryption offers a level of security that surpasses traditional encryption methods, providing businesses and individuals with peace of mind when it comes to protecting their sensitive data. This technology has the potential to revolutionize the field of cybersecurity and safeguard confidential information in an increasingly interconnected world.

Applications for Business Owners

Quantum cryptography can have numerous applications for business owners. Firstly, it can enhance the security of sensitive business information, such as financial records, trade secrets, and customer data, reducing the risk of data breaches and cyberattacks. Secondly, it can enable secure communication channels between businesses, ensuring that sensitive information shared during collaborations or negotiations remains confidential. Additionally, quantum encryption can be utilized in the financial industry to secure online transactions and protect against fraudulent activities. Lastly, this technology can bolster the trust and credibility of businesses by demonstrating their commitment to safeguarding customer privacy and data.

Quantum encryption has the potential to revolutionize cybersecurity and protect confidential information in an interconnected world. It provides enhanced security for sensitive business data, reducing the risk of cyberattacks and breaches. By enabling secure communication channels, it ensures the confidentiality of information shared during collaborations and negotiations. In the financial industry, quantum encryption safeguards online transactions and prevents fraudulent activities. Implementing this technology demonstrates a commitment to customer privacy and enhances trust in businesses.

Quantum Computing for Advanced Data Analysis

Quantum Computing

Quantum computing offers unprecedented computational power for advanced data analysis. It enables businesses to process vast amounts of data quickly and efficiently, leading to valuable insights and informed decision-making. With its ability to handle complex algorithms and simulations, quantum computing opens up new possibilities in areas such as drug discovery, optimization, and machine learning. By harnessing this technology, businesses can stay at the forefront of innovation and gain a competitive edge in their respective industries.

Quantum sensing revolutionizes product development by providing ultra-sensitive measurements and detection capabilities. This technology enhances the accuracy and precision of various sensors, enabling the creation of more advanced and reliable products. By leveraging quantum sensing, businesses can improve the performance and functionality of their offerings, leading to greater customer satisfaction and market success. Additionally, quantum sensing enables the development of entirely new types of products that were previously impossible, opening up new market opportunities and driving innovation.

How Businesses Can Leverage Quantum Key Distribution

Quantum key distribution (QKD) is a cutting-edge technology that enables secure communication through the principles of quantum mechanics. By harnessing QKD, businesses can enhance their data encryption methods and protect sensitive information from potential cyber threats. This technology offers a highly secure and unbreakable method of transmitting cryptographic keys, ensuring that only authorized parties can access the data. By integrating QKD into their communication systems, businesses can safeguard their intellectual property, customer data, and confidential information, reinforcing trust and credibility with their clients. With QKD, businesses can confidently exchange critical data without the fear of interception or compromise, enabling secure collaborations and transactions in an increasingly digital world.
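
As a rough picture of where the shared key comes from, the following toy simulation walks through the sifting step of BB84, one widely used QKD protocol. It is a purely classical sketch with made-up parameters: real QKD runs over a quantum channel and adds error estimation and privacy amplification, which are omitted here.

```python
# A toy, purely classical simulation of BB84 key sifting. Illustration only:
# real QKD uses a quantum channel plus error estimation and privacy amplification.
import random

def bb84_sifted_key(n_bits=32, seed=7):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]   # encoding basis
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]   # measurement basis

    sifted = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:     # matching bases: Bob's result equals Alice's bit
            sifted.append(bit)
        # mismatched bases give a random, useless result and are discarded
    return sifted

key = bb84_sifted_key()
print(len(key), "sifted key bits:", key)   # roughly half the transmitted bits survive
```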

Understanding Quantum Key Distribution

Continuous-variable quantum key distribution (CV-QKD) is a state-of-the-art variant of QKD that utilizes the principles of quantum mechanics to enable secure communication. Implementing QKD allows businesses to strengthen their data encryption methods and protect sensitive information from potential cyber threats. This advanced technology provides a highly secure and unbreakable approach to transmitting cryptographic keys, ensuring that only authorized parties can access the data. By integrating quantum key distribution systems into their communication infrastructure, businesses can safeguard their intellectual property, customer data, and confidential information, thereby reinforcing trust and credibility with their clients. With QKD, businesses can confidently exchange critical data without the fear of interception or compromise, facilitating secure collaborations and transactions in an increasingly digital world.

Implementing Quantum Key Distribution in Business Operations

Quantum key distribution (QKD) is a cutting-edge technology that utilizes the principles of quantum mechanics to enable secure communication. By implementing QKD, businesses can enhance their data encryption methods and protect valuable information from cyber threats. This advanced technology offers an extremely secure and unbreakable approach to transmitting cryptographic keys, ensuring that only authorized parties can access the data. Integrating QKD into communication systems allows businesses to safeguard their intellectual property, customer data, and confidential information, thereby strengthening trust and credibility with clients. With QKD, businesses can confidently exchange crucial data without fear of interception or compromise, enabling secure collaborations and transactions in today’s digital landscape.

In conclusion, Quantum Key Distribution (QKD) is a revolutionary technology that leverages the principles of quantum mechanics to provide highly secure communication. By adopting QKD, businesses can enhance data encryption methods and protect sensitive information from potential cyber threats. This advanced technology ensures that cryptographic keys are transmitted in a secure and unbreakable manner, allowing only authorized parties to access data. By integrating QKD into their communication systems, businesses can effectively safeguard intellectual property, customer data, and confidential information, fostering trust and credibility with clients. With QKD, businesses can confidently exchange critical data without the risk of interception or compromise, enabling secure collaborations and transactions in today’s digital era.

]]>
706
Black hole at the center of a galaxy in the early universe received less mass influx than expected https://hadamard.com/c/black-hole-at-the-center-of-a-galaxy-in-the-early-universe-received-less-mass-influx-than-expected/ https://hadamard.com/c/black-hole-at-the-center-of-a-galaxy-in-the-early-universe-received-less-mass-influx-than-expected/#respond Thu, 08 Feb 2024 18:36:21 +0000 https://hadamard.com/c/?p=622 Continue reading Black hole at the center of a galaxy in the early universe received less mass influx than expected]]>

A team of astronomers, led by the Max Planck Institute for Extraterrestrial Physics, has utilized the upgraded GRAVITY instrument at the Very Large Telescope Interferometer of the European Southern Observatory to determine the mass of a black hole in a galaxy only 2 billion years after the Big Bang. With a mass of roughly 320 million solar masses, the black hole appears to be less massive than expected in comparison to its host galaxy. The researchers speculate on the reasons behind this discrepancy.

In the modern universe, astronomers have observed strong correlations between galaxy properties and the mass of the supermassive black holes residing at their centers, suggesting a co-evolutionary relationship. Examining this relationship in the early universe is challenging due to the distance of these galaxies and the limitations of traditional measurement methods.

The team’s previous breakthrough measurements with GRAVITY in 2018 focused on a nearby quasar. Now, they have extended their observations to a redshift of 2.3, corresponding to a lookback time of 11 billion years.

With GRAVITY+, astronomers can now study black hole growth during the critical epoch known as “cosmic noon,” when both galaxies and black holes were rapidly evolving. This advancement allows for imaging black holes in the early universe with a resolution 40 times sharper than that achievable with the James Webb Space Telescope.

By spatially resolving the motion of gas clouds around the central black hole in the galaxy SDSS J092034.17+065718.0, the team directly measured the black hole’s mass. They found it to be 320 million solar masses, smaller than expected compared to its host galaxy’s mass of about 60 billion solar masses. This indicates that the galaxy grew faster than its supermassive black hole, suggesting a delay in the growth of some systems.

The researchers propose that strong supernova feedback may have played a role in the evolution of this galaxy, expelling gas from its central regions before it could reach the black hole. Only once the galaxy became massive enough to retain a gas reservoir against supernova feedback could the black hole begin to grow rapidly and catch up with the galaxy’s overall growth.

Further high-precision mass measurements of black holes in the early universe are needed to determine if this scenario applies to other galaxies and their central black holes.

]]>
https://hadamard.com/c/black-hole-at-the-center-of-a-galaxy-in-the-early-universe-received-less-mass-influx-than-expected/feed/ 0 622
Breakthrough in Quantum Communication https://hadamard.com/c/breakthrough-in-quantum-communication/ https://hadamard.com/c/breakthrough-in-quantum-communication/#respond Sat, 03 Feb 2024 22:54:15 +0000 https://hadamard.com/c/?p=614 Continue reading Breakthrough in Quantum Communication]]> Researchers have achieved a groundbreaking feat in quantum transmission through a fiber optic cable, covering an unprecedented distance of 248 kilometers. This achievement heralds new possibilities for applications in advanced encryption methods.

While quantum transmissions have already been successfully demonstrated from La Palma to Tenerife and from Vienna to China, this latest milestone, connecting St. Pölten and Bratislava, sets a new world record despite its regional scope.

The prominent role of Nobel laureate Anton Zeilinger is evident once again, as researchers from the Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences, which Zeilinger founded in 2003, spearheaded this endeavor. Their goal extended beyond distance records; they aimed to establish a crucial node in a burgeoning quantum data network called Quapital, envisioned to link Central Europe through quantum connections.

The results of this successful experiment have been disseminated in prestigious journals such as “Nature Communications” and “Quantum”. Over several days, a central transmitter in Vienna sent entangled light particles to St. Pölten and Bratislava. At the receiving ends, these particles were carefully detected to confirm their sustained entanglement. Remarkably, a transmission rate of nine entangled particles per second was maintained for an impressive duration of 110 hours.

The challenge in such transmissions lies in the inherent properties of quantum states, which do not allow for straightforward reading and amplification along the transmission path. Sebastian Neumann, the lead author of the publication, underscores this difficulty, pointing out that unlike conventional data lines, quantum states cannot be readily extracted and duplicated, a principle supported by the “No Cloning” theorem.

However, it is precisely this characteristic that makes quantum links so appealing. They offer a promising avenue for secure data transmission, particularly in contexts where confidentiality is paramount.

The unique nature of quantum physics plays a pivotal role in this endeavor, where observations have tangible consequences, making eavesdropping a significant concern. Quantum links leverage the concept of “entangled” particles, which maintain a mysterious connection over vast distances, as famously articulated by Einstein and later demonstrated by Nobel laureates Clauser and Aspect.

Rupert Ursin, the scientific leader of the project, highlights the significance of quantum entanglement in generating correlated randomness, akin to two coins tossed at different locations invariably landing on the same side. Such randomness serves as an ideal foundation for generating secure encryption keys.
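
Once both ends hold the same random bits, using them is straightforward. The short sketch below is illustrative only, with a locally generated stand-in for the quantum-derived key; it shows shared random bits acting as a one-time pad via XOR.

```python
# Shared correlated randomness, the "coins that always land on the same side",
# can serve directly as a one-time-pad key. Minimal sketch; the key here is a
# locally generated stand-in for what the quantum link would provide.
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

message = b"meet at dawn"
shared_key = secrets.token_bytes(len(message))   # stand-in for the shared key

ciphertext = xor_bytes(message, shared_key)      # encrypt
recovered  = xor_bytes(ciphertext, shared_key)   # decrypt with the same key
print(recovered)                                 # b'meet at dawn'
```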

Crucially, this achievement was realized using standard fiber optic cables, ubiquitous for high-speed data transfer worldwide. Despite the inherent sensitivity of quantum systems to disturbances, which often necessitate specialized equipment, fiber optic cables provided adequate shielding to maintain entanglement over extended distances.

With this record-breaking transmission, the prospect of a secure quantum internet moves closer to reality, promising transformative advances in secure communication.

]]>
https://hadamard.com/c/breakthrough-in-quantum-communication/feed/ 0 614
Albert Einstein: The Beauty of Relativity https://hadamard.com/c/albert-einstein-the-beauty-of-relativity/ https://hadamard.com/c/albert-einstein-the-beauty-of-relativity/#respond Wed, 31 Jan 2024 17:44:48 +0000 https://hadamard.com/c/?p=576 Continue reading Albert Einstein: The Beauty of Relativity]]>

Einstein’s brilliance wasn’t immediately evident in his early school days. His unconventional thinking and rebellious spirit often clashed with the rigid structure of the traditional education system. His teachers, unable to fathom the depth of his intellect, dismissed him as a dreamer. Yet, in the quiet corners of his mind, Einstein was unraveling the secrets of the cosmos.

His struggles in school were not solely due to a lack of intelligence but rather a clash of ideologies. Einstein believed in learning through exploration and imagination, an approach that set him apart from his peers. This rebellious nature, coupled with an unyielding passion for knowledge, propelled him forward despite the challenges he faced.

Einstein’s Journey to Polytechnic Institute in Zurich

As the confines of Ulm became too limiting for his boundless intellect, Einstein set his sights on the renowned Polytechnic Institute in Zurich. This decision marked a turning point in his life, as he embarked on a journey that would shape the course of scientific history.

Entering the Polytechnic Institute was not without its hurdles. Einstein had to prove his mettle in the entrance exams, showcasing his mastery of mathematics and physics. His determination prevailed, and in 1896, he found himself immersed in the vibrant academic atmosphere of Zurich. Here, under the guidance of brilliant professors, Einstein’s mind expanded further, delving into the intricacies of theoretical physics.

The Polytechnic Institute became the crucible in which Einstein’s scientific identity was forged. It was here that he honed his skills, questioned established theories, and cultivated the revolutionary ideas that would later blossom into his groundbreaking theories of relativity. The journey from Ulm to Zurich was not just a geographical transition but a transformative odyssey that laid the foundation for Einstein’s indelible mark on the scientific world.

Special Relativity: A Paradigm Shift in Physics

At the turn of the 20th century, the scientific world found itself grappling with a profound mystery. The Michelson-Morley experiment, conducted in 1887 to detect the Earth’s motion through the luminiferous ether, produced perplexing results. The anticipated changes in light speed, based on Earth’s presumed motion through space, were nowhere to be found. This experimental anomaly threw the established framework of classical physics into disarray, heralding a crisis that demanded a visionary mind to unravel.

Albert Einstein, already immersed in his exploration of theoretical physics, seized upon this enigma. In 1905, he published his groundbreaking paper on Special Relativity, challenging the very fabric of Newtonian mechanics. Einstein posited that the speed of light is constant for all observers, regardless of their motion relative to the light source. This bold proposition shattered the conventional understanding of space and time, opening the door to a new era in physics.

Unveiling the Equivalence of Energy and Mass

E = mc²

Mass Energy equivalence

The equation E=mc², arguably the most famous formula in the realm of physics, emerged as a consequence of Einstein’s Special Relativity. In a stroke of unparalleled insight, Einstein unveiled the equivalence of energy (E) and mass (m), forever altering our perception of the fundamental building blocks of the universe.

This iconic equation implies that mass and energy are interchangeable and interconnected on a profound level. It illuminated the potential for converting matter into energy and vice versa, laying the groundwork for revolutionary advancements in nuclear physics. The atomic bomb, nuclear energy, and myriad technological marvels owe their existence to Einstein’s revelation of the inherent unity of mass and energy.
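
A quick back-of-the-envelope calculation shows just how much energy the equation ascribes to ordinary matter; the snippet below evaluates the rest energy of a single gram.

```python
# A back-of-the-envelope illustration of E = mc^2: the rest energy of one gram.
c = 299_792_458            # speed of light in m/s
m = 0.001                  # one gram, in kilograms

E = m * c**2
print(f"{E:.3e} J")                        # ~8.988e13 joules
print(f"{E / 4.184e12:.1f} kt of TNT")     # ~21.5 kilotons of TNT equivalent
```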

The Concept of Spacetime: Redefining Space and Time

Einstein’s Special Relativity didn’t just reshape our understanding of motion and energy; it introduced the concept of spacetime. Instead of treating space and time as separate entities, Einstein proposed an elegant fusion that encapsulated both in a four-dimensional continuum.

According to this novel perspective, gravity isn’t merely a force acting at a distance, as envisioned by Newton. Instead, massive objects bend the fabric of spacetime, influencing the paths of other objects nearby. This conceptual shift explained gravitational phenomena with unparalleled precision, from the orbits of planets to the bending of light around massive celestial bodies.

In essence, Einstein’s Special Relativity was a seismic shift in the foundations of physics, ushering in a new era of understanding where time and space became intertwined, mass and energy became interchangeable, and the mysteries of the cosmos were approached with a fresh perspective. The humble beginnings in Ulm had led to an intellectual journey that forever altered the trajectory of scientific thought.

Gravitational Waves: Predictions and Observations

Einstein’s journey into the cosmos didn’t end with Special Relativity; it reached its zenith with the development of General Relativity in 1915. One of the most intriguing predictions of this new theory was the existence of gravitational waves—ripples in spacetime caused by the acceleration of massive objects. At the time, these waves were considered a theoretical curiosity, seemingly impossible to detect directly.

Decades later, however, advancements in technology and the construction of sophisticated instruments, such as LIGO (Laser Interferometer Gravitational-Wave Observatory), allowed scientists to observe the elusive gravitational waves for the first time in 2015. This monumental discovery not only validated Einstein’s predictions but also opened a new era in astrophysics, providing a novel tool to explore the cosmos and detect cataclysmic events like the collision of black holes and neutron stars.

Einstein’s Encounter with the Eclipse of 1919

In 1919, a celestial event presented an opportunity to test the predictions of General Relativity. A total solar eclipse was set to occur, and Einstein’s theory posited that massive objects, such as the sun, could bend the path of light. Sir Arthur Eddington, a British astrophysicist, traveled to the island of Principe, while a second expedition observed from Sobral in Brazil, to watch the eclipse and measure the deflection of starlight passing near the sun.

The results of these observations, announced in 1919, were nothing short of revolutionary. The bending of starlight during the eclipse matched Einstein’s predictions, providing empirical evidence for the validity of General Relativity. The world was astounded, and Einstein ascended to scientific celebrity, his theories vindicated on a global stage.

The Mathematics of Curved Spacetime: A Triumph of Theoretical Physics

Einstein’s General Relativity introduced a profound shift in the understanding of gravity. Instead of the traditional gravitational force acting across space, Einstein proposed that massive objects curve the fabric of spacetime itself. The mathematics underlying this concept involved intricate equations that described the curvature caused by mass and its effect on the motion of objects.

Einstein’s field equations, as they came to be known, were a triumph of theoretical physics. The complexity of these equations presented a formidable challenge, requiring a deep understanding of mathematics and physics. Yet, they laid the groundwork for explaining gravitational phenomena on cosmic scales, from the behavior of planets to the expansion of the universe.

In summary, General Relativity not only revolutionized our understanding of gravity but also paved the way for the exploration of the cosmos through gravitational waves. The eclipse of 1919 served as a pivotal moment, confirming the validity of Einstein’s theories and propelling him to iconic status. The mathematical elegance of curved spacetime remains a testament to the power of human intellect in unraveling the mysteries of the universe.

Einstein’s Enduring Influence on Physics

Albert Einstein’s impact on the field of physics is nothing short of revolutionary and enduring. His contributions to both Special and General Relativity reshaped the landscape of theoretical physics, providing a new framework for understanding the fundamental forces governing the universe. The equations he formulated not only explained previously unexplained phenomena but also predicted new ones, guiding the direction of scientific inquiry for generations to come.

Einstein’s work has become a cornerstone of modern physics, influencing diverse areas such as cosmology, quantum mechanics, and astrophysics. The application of his theories has led to practical technologies like GPS, and his ideas continue to inspire groundbreaking research in the quest to comprehend the mysteries of the cosmos.

Beyond his profound contributions to theoretical physics, Einstein played a pivotal role in popularizing science for the general public. Known for his ability to distill complex ideas into accessible concepts, he made the wonders of the universe understandable to a broader audience. Through his writings, public lectures, and interviews, Einstein demystified scientific principles, fostering a greater appreciation for the beauty and elegance of the natural world.

Einstein’s knack for communication transcended the academic realm, reaching people from all walks of life. He believed in the democratization of knowledge and sought to bridge the gap between the scientific community and the general public. His efforts laid the groundwork for a broader societal engagement with science, influencing subsequent generations of scientists and science communicators.

Einstein’s impact extends far beyond the scientific realm; he became a cultural icon and a symbol of intellectual prowess. His distinctive image, with wild hair and thoughtful expression, has become synonymous with genius. Einstein’s name is often invoked in discussions on creativity, curiosity, and the pursuit of knowledge.

Throughout the 20th century and beyond, Einstein’s life and persona inspired literature, art, and popular culture. His quotes are widely cited, and his name is used as shorthand for brilliance. The enduring fascination with Einstein transcends his scientific achievements, encompassing his advocacy for social justice, pacifism, and humanism.

In conclusion, Albert Einstein’s legacy is multi-faceted. His enduring influence on physics has shaped the trajectory of scientific inquiry, while his commitment to making science accessible has ignited curiosity and interest in the broader public. Einstein’s cultural impact and iconic status make him a symbol of intellectual curiosity, leaving an indelible mark on the collective consciousness of humanity.

]]>
https://hadamard.com/c/albert-einstein-the-beauty-of-relativity/feed/ 0 576
Isaac Newton: Birth of the Modern Mind https://hadamard.com/c/isaac-newton-birth-of-the-modern-mind/ https://hadamard.com/c/isaac-newton-birth-of-the-modern-mind/#respond Thu, 25 Jan 2024 17:37:51 +0000 https://hadamard.com/c/?p=557 Continue reading Isaac Newton: Birth of the Modern Mind]]>

Sir Isaac Newton, one of the foremost scientific minds in history, was born on January 4, 1643, in Woolsthorpe, Lincolnshire, England. His childhood was marked by a curious mind and an innate interest in the natural world. Newton’s early education at the King’s School in Grantham laid the foundation for his future intellectual pursuits. It was during these formative years that he demonstrated an exceptional aptitude for mathematics and physics.

Newton’s thirst for knowledge extended beyond the conventional curriculum. His inquisitive nature led him to explore the works of ancient philosophers and mathematicians, such as Aristotle and Euclid, laying the groundwork for his later groundbreaking contributions to science. Despite facing financial challenges, Newton’s academic brilliance caught the attention of university officials, eventually earning him a place at Trinity College, Cambridge, in 1661.

As a student, Newton delved into various branches of mathematics, optics, and astronomy. His groundbreaking work on calculus, developed independently of Leibniz, revolutionized the field of mathematics and provided a powerful tool for describing the motion of objects. The seeds of his future discoveries were sown during his early years, setting the stage for the scientific revolution that would unfold in the years to come.

Genesis of the Principia: Newton’s Groundbreaking Work

The Principia Mathematica, published in 1687, stands as a monumental achievement in the annals of scientific literature. Newton’s journey to this magnum opus began with a confluence of intellectual curiosity and methodical brilliance. The genesis of the Principia can be traced back to Newton’s intense period of intellectual ferment during the mid-1660s, often referred to as his “miraculous year.”

Driven by a relentless pursuit of understanding the fundamental laws governing the physical world, Newton explored the realms of mathematics, optics, and celestial mechanics. The culmination of his efforts was the groundbreaking development of calculus, a mathematical framework that provided him with the tools to articulate his revolutionary ideas. Newton’s manuscript “De Analysi” laid the groundwork for the Principia, setting the stage for the synthesis of disparate threads into a unified theory of mechanics and gravitation.

Unraveling the Laws of Motion

F = ma (Newton’s second law of motion)

At the core of the Principia Mathematica are Newton’s three laws of motion, which transformed our understanding of how objects move and interact. The first law states that an object at rest will remain at rest, and an object in motion will continue moving in a straight line at constant speed, unless acted upon by a net external force. The second law quantifies the concept of force, stating that the net force acting on an object equals the mass of the object multiplied by its acceleration. The third law asserts that for every action, there is an equal and opposite reaction.
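As a minimal illustration of the second law in action, the sketch below (with made-up values for the mass and force) computes the acceleration from F = ma and steps the resulting motion forward with a simple Euler integration:

mass = 2.0                     # kg (illustrative value)
force = 10.0                   # N, constant net force (illustrative value)
acceleration = force / mass    # second law: a = F / m

velocity, position, dt = 0.0, 0.0, 0.1   # start at rest; 0.1 s time steps
for _ in range(50):                      # simulate 5 seconds of motion
    velocity += acceleration * dt
    position += velocity * dt

print(acceleration, velocity, round(position, 2))
# 5.0 m/s^2, 25.0 m/s and about 63.75 m (the exact result, 0.5*a*t^2, is 62.5 m)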

These laws formed the bedrock of classical mechanics, providing a systematic framework for describing the motion of objects on Earth and beyond. The elegance and simplicity of Newton’s laws marked a departure from previous explanations, ushering in a new era of scientific inquiry. The precision and generality of these laws enabled scientists to predict and understand the motion of celestial bodies, laying the groundwork for the subsequent advancements in physics.

Universal Gravitation: Connecting the Cosmos

Newton’s crowning achievement within the Principia was the formulation of the law of universal gravitation. Building upon Kepler’s laws of planetary motion and the conceptual groundwork laid by Galileo, Newton proposed that every particle of matter in the universe attracts every other particle with a force directly proportional to the product of their masses and inversely proportional to the square of the distance between their centers.

This law not only explained the motion of celestial bodies but also provided a unifying framework for understanding the forces at play both on Earth and in the heavens. The law of universal gravitation demonstrated that the same principles govern the fall of an apple and the orbits of planets.
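In symbols the law reads F = G·m1·m2/r², and its everyday consequence is easy to check numerically. The sketch below is an illustrative calculation only, using standard values for Earth’s mass and radius to recover the familiar surface gravity:

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
m_earth = 5.972e24   # kg
r_earth = 6.371e6    # m, mean radius of the Earth

def gravitational_force(m1, m2, r):
    # Newton's law of universal gravitation
    return G * m1 * m2 / r**2

test_mass = 1.0      # kg
g = gravitational_force(m_earth, test_mass, r_earth) / test_mass
print(round(g, 2))   # about 9.82 m/s^2, the acceleration that makes the apple fall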

Unraveling the Mystical Mind: Alchemical Works

Newton’s interest in alchemy was not a mere dalliance; rather, it was a serious and sustained pursuit that spanned several decades of his life. His alchemical manuscripts, totaling in the thousands of pages, showcase a complex interplay of symbolism, obscure language, and cryptic diagrams. Newton’s alchemical endeavors were not disconnected from his scientific work; instead, they reflected a holistic approach to understanding the natural world.

In these mystical pursuits, Newton sought not only the transformation of base metals into gold but also a deeper understanding of the hidden forces governing the cosmos. The allegorical language of alchemy allowed him to express ideas about the nature of matter, the interconnectedness of elements, and the quest for spiritual enlightenment. For Newton, alchemy was a symbolic language that complemented his mathematical and scientific endeavors.

The intertwining of alchemy with Newton’s scientific pursuits raises intriguing questions about the interconnectedness of seemingly disparate fields. While his alchemical works did not contribute directly to the advancements in physics, they offer a unique window into the mind of a genius who saw the pursuit of knowledge as a seamless tapestry, weaving together the empirical and the mystical.

In conclusion, Newton’s ventures into alchemy and mysticism underscore the complexity of his intellectual character. Beyond the well-known achievements in mathematics and physics, his exploration of the arcane arts reveals a man driven by an insatiable curiosity that transcended the boundaries of traditional scholarship. Newton’s legacy extends beyond the Principia Mathematica, encompassing a broader spectrum of human inquiry that reflects the intricate tapestry of his intellectual pursuits.

The Newtonian Revolution: Transforming Science

The Newtonian Revolution stands as a pivotal chapter in the history of science, reshaping the way humanity perceives and investigates the natural world. Sir Isaac Newton’s profound contributions to physics and mathematics during the 17th century marked a paradigm shift that laid the groundwork for the modern scientific method. Newton’s magnum opus, the Principia Mathematica, not only provided a comprehensive framework for understanding the laws of motion and universal gravitation but also established a new standard for scientific inquiry.

At the heart of the Newtonian Revolution was a commitment to empiricism and mathematical precision. Newton’s laws of motion and the law of universal gravitation were not speculative theories but elegant expressions of natural phenomena grounded in rigorous observation and mathematical formulation. This emphasis on empirical evidence and mathematical rigor became the hallmark of the scientific method, guiding future generations of scientists in their quest for understanding.

Newton’s methodology encompassed the systematic observation of the physical world, the formulation of mathematical models to describe observed phenomena, and the testing of these models through experimentation. This approach represented a departure from the speculative reasoning prevalent in earlier scientific traditions, establishing a foundation upon which the scientific method would evolve and flourish.

Influence on Subsequent Generations: Newtonian Legacy

Newton’s legacy extends far beyond his own lifetime, influencing subsequent generations of scientists and thinkers. The Newtonian paradigm became a cornerstone of the Enlightenment, a period characterized by a commitment to reason, empirical observation, and the pursuit of knowledge. The scientific method, as exemplified by Newton, became a template for inquiry in various disciplines, fostering a culture of systematic investigation and intellectual rigor.

The impact of Newton’s ideas resonated across diverse fields, from physics and astronomy to philosophy and beyond. His principles not only advanced our understanding of the physical universe but also inspired a broader societal shift towards valuing evidence-based reasoning. Newton’s legacy became intertwined with the very fabric of modernity, shaping the intellectual landscape of the centuries that followed.

In the realm of physics, Newton’s laws provided a robust foundation upon which subsequent theories, such as those of Albert Einstein, would build. The concept of universal gravitation, while later refined, remains a fundamental part of our understanding of celestial mechanics.

Beyond the scientific realm, Newton’s influence extended into philosophy, where his emphasis on empirical observation and the scientific method laid the groundwork for the empiricist tradition. The Enlightenment ideals of reason, progress, and the pursuit of knowledge owe a debt to Newton’s intellectual legacy.

In conclusion, Newton’s legacy is not confined to equations and theorems; it permeates the very fabric of the scientific method and the broader intellectual ethos of modern civilization. The Newtonian Revolution transformed science from a speculative pursuit into a disciplined, evidence-based inquiry, shaping the trajectory of human understanding and paving the way for the remarkable scientific advancements of subsequent centuries.

]]>
https://hadamard.com/c/isaac-newton-birth-of-the-modern-mind/feed/ 0 557
Exploring the Legacy of Jacques Hadamard: A Mathematician Ahead of His Time https://hadamard.com/c/exploring-the-legacy-of-jacques-hadamard-a-mathematician-ahead-of-his-time/ https://hadamard.com/c/exploring-the-legacy-of-jacques-hadamard-a-mathematician-ahead-of-his-time/#respond Sun, 07 Jan 2024 01:32:24 +0000 https://hadamard.com/c/?p=370 Continue reading Exploring the Legacy of Jacques Hadamard: A Mathematician Ahead of His Time]]> Jacques Hadamard, a French mathematician born on December 8, 1865, left an indelible mark on the world of mathematics with his groundbreaking contributions. Known for his work in number theory, complex analysis, and partial differential equations, Hadamard’s intellectual prowess extended beyond his time, influencing generations of mathematicians. This article delves into the life, work, and enduring legacy of Jacques Hadamard.

Early Life and Education:

Hadamard’s journey into the world of mathematics began at an early age. Born in Versailles, France, he displayed exceptional mathematical talent during his formative years. He attended the École Normale Supérieure in Paris, where he studied under renowned mathematicians such as Henri Poincaré and Charles Hermite.

Contributions to Number Theory:

One of Hadamard’s most celebrated achievements was his work in number theory. In 1896 he proved the prime number theorem, independently of Charles de la Vallée Poussin, who arrived at the result in the same year. This theorem describes the distribution of prime numbers and is considered one of the most important results in the field of number theory.

Hadamard’s approach to the prime number theorem was novel, using techniques from complex analysis. His work not only provided a solution to a long-standing mathematical problem but also paved the way for further developments in the understanding of the distribution of prime numbers.
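In modern notation, the theorem states that the number of primes up to x, written \pi(x), grows like x divided by its natural logarithm:

\pi(x) \sim \frac{x}{\ln x} \quad \text{as } x \to \infty

that is, the ratio of the two sides tends to 1 as x becomes large.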

Hadamard Transform and Matrix Theory:

Jacques Hadamard’s contributions extended to the realm of linear algebra through his study of what are now called Hadamard matrices, which underpin the Hadamard transform. This linear operation maps a sequence of numbers onto another sequence of the same length using only additions and subtractions, and it finds applications in signal processing, error correction, and cryptography. The Hadamard transform has become a fundamental tool in various fields, demonstrating the practical implications of Hadamard’s theoretical insights.
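A minimal sketch of the idea, assuming the usual construction of the 2^n × 2^n Hadamard matrix from Kronecker powers of the 2 × 2 kernel (the example vector is arbitrary):

import numpy as np

def hadamard_matrix(n):
    # Build the 2^n x 2^n Hadamard matrix as the n-fold Kronecker power of [[1, 1], [1, -1]]
    H2 = np.array([[1, 1], [1, -1]])
    H = np.array([[1]])
    for _ in range(n):
        H = np.kron(H, H2)
    return H

x = np.array([1, 0, 1, 1, 0, 1, 0, 0])
print(hadamard_matrix(3) @ x)   # unnormalised Hadamard (Walsh) transform of x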

Beyond the transform that bears his name, Hadamard made significant contributions to matrix theory, notably the determinant bound now known as Hadamard’s inequality, which influenced subsequent developments in linear algebra.

Hadamard’s Impact on Functional Analysis and Partial Differential Equations:

Hadamard’s mathematical prowess was not confined to number theory and linear algebra. His work in functional analysis and partial differential equations significantly advanced these fields. His investigations into the behavior of solutions to partial differential equations broadened the understanding of the fundamental principles governing these mathematical entities.

Later Life and Legacy:

Jacques Hadamard continued to contribute to mathematics throughout his career, receiving numerous accolades and honors, including election to the French Academy of Sciences in 1912. He also held a professorship at the Collège de France from 1909 until 1937.

Hadamard’s legacy endures through his extensive body of work and the impact he had on subsequent generations of mathematicians. His influence extends beyond his specific theorems and formulas, shaping the way mathematicians approach problems and inspiring further exploration into the depths of mathematical theory.

Conclusion:

Jacques Hadamard’s brilliance as a mathematician left an indelible mark on the field, with his contributions spanning number theory, linear algebra, functional analysis, and partial differential equations. His innovative thinking and groundbreaking results continue to shape the landscape of mathematics, making him a revered figure among mathematicians worldwide. As we celebrate the legacy of Jacques Hadamard, we acknowledge the enduring impact of his work on the evolution of mathematical thought.

]]>
https://hadamard.com/c/exploring-the-legacy-of-jacques-hadamard-a-mathematician-ahead-of-his-time/feed/ 0 370
E91 Protocol: Harnessing Entanglement for Secure Communication https://hadamard.com/c/e91-protocol-harnessing-entanglement-for-secure-communication/ https://hadamard.com/c/e91-protocol-harnessing-entanglement-for-secure-communication/#respond Fri, 29 Dec 2023 04:39:36 +0000 https://hadamard.com/news/?p=34 Continue reading E91 Protocol: Harnessing Entanglement for Secure Communication]]> In the rapidly evolving landscape of quantum information science, the pursuit of secure communication has led to groundbreaking developments, one of which is the E91 protocol. Proposed in 1991 by the physicist Artur Ekert, whose initial and the year give the protocol its name, E91 is a quantum key distribution (QKD) scheme that exploits the unique phenomenon of quantum entanglement to establish secure communication channels.

Understanding Quantum Entanglement

Quantum entanglement is a phenomenon that occurs when two or more quantum particles become correlated in such a way that the state of one particle is directly related to the state of the other, regardless of the physical distance between them. This intriguing property forms the basis of the E91 protocol, allowing for the creation of secure cryptographic keys.

What are the Principles of the E91 Protocol?

The E91 protocol is based on Bell’s theorem, which asserts that certain correlations between entangled quantum particles cannot be reproduced by any local, classical description. In the E91 protocol, pairs of entangled particles are distributed between two distant parties, traditionally named Alice and Bob. The protocol involves the following key steps (a small numerical sketch of the entanglement check follows the list):

Entanglement Generation:

A source produces pairs of entangled particles, typically photons, and sends one member of each pair to Alice and the other to Bob.

These particles are prepared in a quantum state known as a Bell state, in which the measured properties of one particle are perfectly correlated with those of the other.

Particle Measurement:

After generating entangled pairs, Alice and Bob independently choose random measurement settings to observe specific properties of their particles.

Correlation Check:

Alice and Bob publicly announce which measurement setting they used for each particle, while initially keeping their results secret.

For a subset of rounds, they also reveal their measurement outcomes and use them to test a Bell (CHSH) inequality; a violation indicates genuine quantum entanglement and the absence of undetected eavesdropping.

Bit Selection:

Provided the correlation check confirms entanglement, Alice and Bob discard the publicly revealed test bits and the rounds with incompatible settings, keeping only the outcomes from rounds in which they happened to use compatible measurement settings.

Key Extraction:

The remaining bits are then used as a shared secret key between Alice and Bob for secure communication.
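As a toy illustration of the correlation check described above, the sketch below evaluates the textbook singlet-state prediction E(a, b) = -cos(a - b) at a standard set of analyzer angles and shows that the CHSH combination exceeds the classical bound of 2; the angles and function names are illustrative, not taken from any particular implementation:

import math

def singlet_correlation(angle_a, angle_b):
    # Quantum prediction for a spin-singlet pair measured along these two directions
    return -math.cos(angle_a - angle_b)

a, a_prime = 0.0, math.pi / 2                 # Alice's two analyzer settings (radians)
b, b_prime = math.pi / 4, 3 * math.pi / 4     # Bob's two analyzer settings

S = (singlet_correlation(a, b) - singlet_correlation(a, b_prime)
     + singlet_correlation(a_prime, b) + singlet_correlation(a_prime, b_prime))

print(abs(S))   # about 2.83 = 2*sqrt(2), violating the classical CHSH limit of 2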

What are the Challenges and Future Directions?

While the E91 protocol represents a significant advancement in quantum cryptography, challenges such as practical implementation issues and the need for high-efficiency entangled photon sources still exist. Researchers are actively working to address these challenges and explore the potential of the E91 protocol in real-world applications.

]]>
https://hadamard.com/c/e91-protocol-harnessing-entanglement-for-secure-communication/feed/ 0 34
BB84 Protocol: Quantum Key Distribution for Unbreakable Communication https://hadamard.com/c/bb84-protocol-quantum-key-distribution-for-unbreakable-communication/ https://hadamard.com/c/bb84-protocol-quantum-key-distribution-for-unbreakable-communication/#respond Thu, 28 Dec 2023 17:49:19 +0000 https://hadamard.com/news/?p=32 Continue reading BB84 Protocol: Quantum Key Distribution for Unbreakable Communication]]> In the realm of secure communication, the BB84 protocol stands out as a groundbreaking achievement in the field of quantum key distribution (QKD). Proposed by Charles Bennett and Gilles Brassard in 1984, the BB84 protocol leverages the principles of quantum mechanics to enable the exchange of cryptographic keys in a manner that is theoretically immune to eavesdropping. This innovation addresses a fundamental challenge in secure communication: how to establish a secret key between two parties in a way that guarantees its secrecy against the most sophisticated adversaries.

Understanding Quantum Key Distribution

Classical cryptographic systems rely on mathematical algorithms for securing communication, but their security is fundamentally based on the complexity of mathematical problems. With the advent of quantum computing, the potential for breaking these cryptographic systems has increased, prompting the need for alternative approaches.

Quantum key distribution, or QKD, exploits the principles of quantum mechanics to enable two parties, traditionally named Alice and Bob, to exchange a secret key with the assurance that any eavesdropping attempt will be detectable. The BB84 protocol is one of the earliest and most well-known QKD protocols.

What are the Key Principles of the BB84 Protocol?

Quantum Superposition:

The BB84 protocol exploits the property of quantum superposition, where a quantum bit (qubit) can exist in multiple states simultaneously.

Alice sends a stream of qubits to Bob, each qubit being in one of two possible bases, typically represented by the rectilinear basis (0° and 90°) and the diagonal basis (45° and 135°).

Uncertainty Principle:

The Heisenberg Uncertainty Principle plays a crucial role in BB84. Measuring a qubit in the wrong basis disturbs its quantum state, so any eavesdropper who intercepts and measures the qubits in transit inevitably introduces errors that Alice and Bob can later detect.

Random Basis Choice:

Alice randomly chooses the basis for each qubit she sends to Bob. This ensures that the eavesdropper, traditionally named Eve, cannot predict the basis with certainty.

Public Communication:

After the quantum communication phase, Alice and Bob publicly compare a subset of their respective measurement bases.

They discard the qubits where their bases do not match, thus revealing only the bits that were measured in the same basis.
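A minimal sketch of the sifting logic described above, for an idealized noise-free channel with no eavesdropper (the block length and names are illustrative):

import random

def bb84_sift(n=32):
    # Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal)
    alice_bits = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.randint(0, 1) for _ in range(n)]
    # Bob measures each incoming qubit in a randomly chosen basis;
    # a matched basis reproduces Alice's bit, a mismatched basis gives a random outcome
    bob_bases = [random.randint(0, 1) for _ in range(n)]
    bob_results = [bit if a_basis == b_basis else random.randint(0, 1)
                   for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)]
    # Public discussion: keep only the positions where the bases matched
    sifted_key = [bit for bit, a_basis, b_basis in zip(bob_results, alice_bases, bob_bases)
                  if a_basis == b_basis]
    return sifted_key

print(bb84_sift())   # on average about half of the transmitted bits survive the sifting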

The BB84 protocol represents a milestone in the quest for secure communication in the face of advancing technologies that threaten classical cryptographic systems. By harnessing the unique properties of quantum mechanics, the protocol provides a framework for exchanging cryptographic keys with unprecedented security guarantees. As quantum technologies continue to evolve, so too will the field of quantum key distribution, with the BB84 protocol serving as a foundational building block for future innovations in secure communication.

]]>
https://hadamard.com/c/bb84-protocol-quantum-key-distribution-for-unbreakable-communication/feed/ 0 32
D-Wave Quantum Applauds Inclusion of Quantum Pilot Program in U.S. National Defense Authorization Act https://hadamard.com/c/d-wave-quantum-applauds-inclusion-of-quantum-pilot-program-in-u-s-national-defense-authorization-act/ https://hadamard.com/c/d-wave-quantum-applauds-inclusion-of-quantum-pilot-program-in-u-s-national-defense-authorization-act/#respond Thu, 28 Dec 2023 17:00:59 +0000 https://hadamard.com/news/?p=30 Continue reading D-Wave Quantum Applauds Inclusion of Quantum Pilot Program in U.S. National Defense Authorization Act]]> D-Wave Quantum Inc., a company in quantum computing, has commended U.S. policymakers for incorporating a quantum pilot program into the National Defense Authorization Act (NDAA). This landmark legislation, signed into law by President Biden on December 22, 2023, establishes a program focused on near-term quantum computing applications to address challenges faced by the Department of Defense (DOD) and the Armed Forces. D-Wave, known for its quantum computing systems, software, and services, applauds the bipartisan effort, emphasizing the importance of leveraging quantum technologies to enhance national security.

The quantum pilot program outlined in the NDAA aims to test, evaluate, and implement quantum and quantum-hybrid applications to address pressing issues in defense, military, and national security operations. Notably, the program encourages the use of annealing quantum computing and quantum-hybrid technologies. It also emphasizes the need to build and strengthen collaborations between the DOD, academic institutions, small businesses, and non-traditional defense contractors.

D-Wave, as the world’s first commercial supplier of quantum computers, is well-positioned to contribute to the program. The company has already developed demonstrations, proof of concepts (POCs), and applications for various governments and businesses globally. Examples include optimizing cargo pier operations at the Port of Los Angeles, last-mile resupply applications for the Australian Army, and initiatives for carbon emission reduction and tsunami evacuation route optimization in Japan.

Dr. Alan Baratz, CEO of D-Wave, expressed appreciation for Congress and the Administration, stating that the quantum pilot program accelerates the use of quantum solutions for critical challenges faced by the DOD and the military. He highlighted the bipartisan nature of this action and emphasized D-Wave’s readiness to collaborate with the government to advance the program.

The inclusion of a quantum pilot program in the NDAA marks a pivotal moment in the integration of quantum computing into U.S. national security efforts. D-Wave Quantum Inc.’s commendation reflects the commitment of the quantum industry to collaborate with government agencies in addressing complex challenges. As the program unfolds, it is anticipated to usher in a new era of technological innovation and strategic solutions for the defense and security landscape.

]]>
https://hadamard.com/c/d-wave-quantum-applauds-inclusion-of-quantum-pilot-program-in-u-s-national-defense-authorization-act/feed/ 0 30
Is optical computing the future? https://hadamard.com/c/is-optical-computing-the-future/ https://hadamard.com/c/is-optical-computing-the-future/#respond Thu, 28 Dec 2023 16:42:54 +0000 https://hadamard.com/news/?p=26 Continue reading Is optical computing the future?]]> Let us begin by answering directly what optical computing is.

Optical computing refers to the use of light or photons to perform computation instead of traditional electronic means. It holds promise for faster and more efficient processing.

How can one build logic gates and complex circuits using light? To construct any logic circuit, such as an adder, it is enough to have a universal gate, either NAND or NOR, from which every other gate can be derived. A NOR gate can be implemented by exploiting the naturally OR-like behaviour of light: if either input signal is on, the combined output is on, and only when both input signals are off is the output off. The challenge lies in adding the NOT stage that turns this OR into a NOR, inverting the signal from on to off and vice versa.
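The claim that every gate can be derived from NOR alone is easy to verify in ordinary code; the sketch below is purely logical and says nothing about how an optical implementation would realize each gate:

def nor(a, b):
    # The only primitive gate assumed available
    return int(not (a or b))

def not_(a):
    return nor(a, a)

def or_(a, b):
    return not_(nor(a, b))

def and_(a, b):
    return nor(not_(a), not_(b))

def xor(a, b):
    return and_(or_(a, b), not_(and_(a, b)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "OR:", or_(a, b), "AND:", and_(a, b), "XOR:", xor(a, b))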

To realize this inverting component optically, interference patterns of light waves are exploited.

Proponents claim that internal data rates could improve by a factor of roughly 1000 compared with conventional computers. The difference lies in the use of photons instead of electrons: in contrast to traditional processors, data can flow continuously through the optical logic during a computation, without constantly having to be stopped, buffered or redirected.

Because communication today is largely optical over terrestrial networks, every time we send data over fiber cables we must convert electronic signals into optical ones and back again, a step that optical computing could in principle avoid. The catch is that shrinking optical logic circuits to the microscopic scale needed for computer-like processing is far more complex than building silicon-based processors. Despite these challenges, optical computing remains an intriguing area for the future of computers, especially in communication applications, where the conversion between electronic and optical signals can be cumbersome and resource-intensive.

]]>
https://hadamard.com/c/is-optical-computing-the-future/feed/ 0 26
Is a quantum era even possible? https://hadamard.com/c/is-a-quantum-era-even-possible/ https://hadamard.com/c/is-a-quantum-era-even-possible/#respond Thu, 28 Dec 2023 15:26:43 +0000 https://hadamard.com/news/?p=23 Continue reading Is a quantum era even possible?]]> At a time when the intervals between new process nodes from TSMC and Samsung are growing longer and the performance and efficiency gains are becoming increasingly marginal, many are asking not whether quantum computers will pick up where a fading Moore’s Law leaves off, but when. Yet is it even technically possible to entangle enough qubits to satisfy our constant hunger for computing power?

In 2023, the leading Taiwanese semiconductor producer TSMC rolled out its 3 nm node, which is used, among other things, in the A17 Pro chip of the iPhone 15 Pro. Compared with the 5 nm node, the new process allows roughly a 1.6x increase in logic density and a 30-35% power reduction. While this may sound remarkable, it arrived about three years after the 5 nm node, whereas Moore’s Law, as commonly stated, expects transistor density, and with it computing power or efficiency, to double roughly every 24 months. It should be clear by now that it will not remain cost-effective to shrink transistors indefinitely, and only a handful of manufacturers, essentially TSMC and Samsung, still compete at the leading edge.

Our deterministic computing is inevitably reaching a fundamental limit because some components on our chips are now only a few silicon atoms in size.

But what does the future look like?

Quantum computing is still at an early, experimental stage. A quantum search such as Grover’s algorithm offers a quadratic speed-up: each additional entangled qubit doubles the size of the search space that can be handled, while the number of search steps required grows only with the square root of that space. However, every added qubit also makes the system harder to keep stable, because physical qubits such as trapped ions are very sensitive to environmental influences.
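To make that scaling concrete, the optimal number of Grover iterations for finding one marked item among N = 2^n possibilities is roughly (pi/4)·sqrt(N); the illustrative sketch below shows how slowly that count grows compared with the search space itself:

import math

def grover_iterations(n_qubits):
    # Optimal number of Grover iterations for one marked item among N = 2^n entries
    N = 2 ** n_qubits
    return math.floor(math.pi / 4 * math.sqrt(N))

for n in (10, 20, 30):
    print(f"{n} qubits: N = {2**n}, about {grover_iterations(n)} iterations")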

So, will we be able to entangle thousands, if not millions, of qubits in the near future, stably and error-free? Probably yes, but just as the potential computing power grows with each qubit, so does the difficulty of operating the whole system stably and without errors.

When looking back in time, we can draw parallels to the challenges Intel faced in developing the first microprocessor. In 1971, Italian engineer Federico Faggin, American engineers Marcian Hoff and Stanley Mazor, and Japanese engineer Masatoshi Shima pushed the limits of what was technically possible to bring the first commercial microprocessor to market. At that time, it seemed inconceivable, even physically impossible, to one day pack billions of microscopic transistors onto a chip and produce them in large quantities at an affordable price. Only time will tell by the end of this decade whether and when a similar development will occur in quantum computing.

]]>
https://hadamard.com/c/is-a-quantum-era-even-possible/feed/ 0 23
Quantum science and technology: highlights of 2023 https://hadamard.com/c/quantum-science-and-technology-highlights-of-2023/ https://hadamard.com/c/quantum-science-and-technology-highlights-of-2023/#respond Thu, 28 Dec 2023 11:00:31 +0000 https://physicsworld.com/?p=111900 Continue reading Quantum science and technology: highlights of 2023]]> It’s been another banner year for quantum science and technology, with academic research groups and tech firms celebrating significant achievements in quantum computing, quantum communications and quantum metrology as well as fundamental quantum science. Three of these advances – a quantum repeater that transmits quantum information over a distance of 50 km; a double-slit experiment in time; and a simulation of an expanding universe in a Bose-Einstein condensate – appeared in our list of the year’s top 10 breakthroughs, but with so many exciting things going on, we can’t resist celebrating a few others. Here, in no particular order, are some highlights.


Joining the hardware dots


Some innovations hit the headlines right away. Others lay the groundwork for future breakthroughs. In May, Johannes Fink and colleagues at the Institute for Science and Technology Austria claimed a place in the second group by demonstrating a protocol for entangling microwave and optical photons. This is important because the superconducting circuits that make up many of today’s most advanced quantum computers operate at microwave frequencies, but the fibres and other equipment used to send information over long distances work at optical frequencies. If we want to build a network of many quantum computers and make them talk to each other, we will therefore need strong, reliably quantum connections between these two frequencies.


Now that Fink and his team have shown that such connections are possible, prospects for quantum networks based on superconducting qubits look rosier, though the protocol still needs refining. As one independent expert observed, “We should not think that this makes everything easy now – it’s just the beginning, but that doesn’t take away from the quality of the experiment.”


A similarly slow-burn advance occurred in August when researchers in John Bowers’ group at the University of California, Santa Barbara, put a laser and a photonic waveguide on the same chip for the first time. Integrated photonic systems like these will be crucial to scaling up quantum computers based on trapped ions or atoms, but lasers and waveguides haven’t always played well together. Specifically, when light from a laser enters a waveguide, some of it gets reflected, and if this reflected light gets back to the laser, the laser’s output becomes unstable. By designing a chip that avoids these unwanted interactions, Bowers and colleagues made the job of future quantum hardware designers much easier.


Milestones in quantum metrology


In the year the first commercial optical atomic clock went on sale, quantum metrologists also notched up an achievement at the other end of the technology readiness scale. Just as optical clocks are more precise than their microwave-frequency predecessors, clocks that “tick” every time an atom’s nucleus undergoes an energy transition would be more precise still. They might even be precise enough to catch fundamental constants in the act of fluctuating on very short time scales, which would violate the Standard Model of particle physics.


The problem is that no-one knows the frequencies of these nuclear transitions well enough to drive them with a laser. In June, though, physicists at CERN got closer to finding out when they detected a photon emitted by a thorium-229 ion as it returned to its nuclear ground state. Though much work remains to be done, the result is nonetheless a step towards the next generation of ultra-precise timekeeping.


Meanwhile, physicists at the University of Colorado, Boulder, US, put down a marker in their quest to measure the electron electric dipole moment (eEDM) to ever-greater precision. A non-zero value of this quantity would violate the Standard Model, and in August, a team led by Jun Ye and Eric Cornell announced that the eEDM must be less than 4.1 × 10⁻³⁰ e cm, with an uncertainty of 2.1 × 10⁻³⁰ e cm – a precision equivalent to measuring the Earth to within the dimensions of a virus.


The emergence of effective quantum error correction


Errors are the bane of quantum computers, and demonstrating ways to correct them is a major goal of quantum computing research. In 2023, these efforts started to pay off. In February, researchers at Google Quantum AI announced that they had suppressed errors in their superconducting-qubit device by implementing a surface code. This type of quantum error-correcting code encodes a logical (that is, error-corrected) qubit in the joint entangled state of many physical qubits. The following month, a team at Yale University in the US demonstrated a different approach to the same problem, using a qubit encoding called a GKP code to suppress errors with the help of additional information embedded in superconducting transmon qubits.


Arguably the year’s most impressive error-correction result, however, came just a few weeks ago, when Mikhail Lukin and colleagues at Harvard University, QuEra Computing, the Massachusetts Institute of Technology and the NIST/University of Maryland Joint Center for Quantum Information and Computer Science reported that they had created an array of 48 logical qubits using neutral atoms.


Even before this announcement, 2023 was looking like a breakout year for neutral-atom quantum computers, which are having a moment after a long period of trailing behind devices that use superconducting circuits or trapped ions as qubits. Will 2024 be the year they leap ahead? Or will their rivals find new advantages to exploit? Watch this space!


The best of the rest


Finally, a few of 2023’s quantum achievements stand out for their sheer ingenuity. This year saw the first observation of quantum superchemistry, which occurs when chemical reactions speed up because the reacting molecules are all in the same quantum state. It also marked the first time anyone had spotted quantum entanglement in top quarks, which have a lifetime of just 10⁻²⁵ seconds. The most ingenious quantum result of the year, though, is surely the demonstration of an engine that runs on the energy difference between bosons and fermions. As an example of the links between classical and quantum physics, it could hardly be better.


The post Quantum science and technology: highlights of 2023 appeared first on Physics World.


]]>
https://hadamard.com/c/quantum-science-and-technology-highlights-of-2023/feed/ 0 37
Medical physics and biotechnology: highlights of 2023 https://hadamard.com/c/medical-physics-and-biotechnology-highlights-of-2023/ https://hadamard.com/c/medical-physics-and-biotechnology-highlights-of-2023/#respond Wed, 27 Dec 2023 10:00:04 +0000 https://physicsworld.com/?p=111911 Continue reading Medical physics and biotechnology: highlights of 2023]]> This year, the Physics World team selected a medical innovation as the Breakthrough of the Year: the development of a digital bridge that restores communication between the brain and spinal cord, enabling a man with paralysis to stand and walk naturally. We also reported on several other neural engineering advances, including a neuroprosthesis that restores communication to those who cannot speak and an award-winning implant that could help regulate blood pressure in people with spinal-cord injuries.


And that’s just one example of the impact of physics-related research on the healthcare sector. In 2023, we wrote about a host of medical physics and biotechnology advances, from photon-counting detectors that produce high-quality images with less contrast media to hydrogels that help grow new brain tissue to shoot-through FLASH proton therapy. Here are a few more highlights that caught our eye.


Novel takes on nuclear medicine


Among the many developments in positron emission tomography (PET) technology announced this year, a research team headed up at Memorial Sloan Kettering Cancer Center and Complutense University of Madrid devised a novel image reconstruction method that enables in vivo imaging of two different PET tracers simultaneously. This “multiplexed PET” technique, which increases the amount of information attainable during a single scan, can be implemented on preclinical or clinical PET systems without having to modify the hardware or image acquisition software.


Aiming to meet the ever-increasing clinical demand for PET scans, researchers at Ghent University in Belgium are developing a walk-through total-body PET scanner. Their proposed upright imaging system, which looks a bit like an airport security scanner, is expected to be both cheaper and quicker to use than standard PET instruments.


And researchers at UC Davis used total-body PET to perform first-in-human immunoPET imaging of T cell biodistribution in three healthy individuals and five patients recovering from COVID-19. Quantification of immune cell distribution and kinetics in humans can shed light on how the immune system responds to viral infections, and help researchers develop new vaccines and improved treatments.


Radiotherapy for the future


The introduction of MR-guided radiotherapy, which uses MRI to visualize tumours and surrounding organs with high accuracy while the patient is on the treatment table, enables clinical advances such as real-time plan modification and imaging of moving tumours during treatment delivery. But integrated MR-linac systems hold potential to do a lot more.


Researchers at the University Hospital of Zurich investigated an approach called adaptive fractionation, which exploits inter-fraction motion (rather than simply compensating for it) by adjusting the prescribed dose according to the distance between the tumour and organs-at-risk (OAR) on each day. In other words, a patient is prescribed a higher radiation dose on days when their tumour–OAR separation is large and a lower dose on days when this separation is small.


Elsewhere, a team at the University of Toronto’s Sunnybrook Health Sciences Centre studied the use of diffusion-weighted imaging (DWI) on an MR-linac to improve treatment of the aggressive brain cancer glioblastoma. The idea here is to use DWI to identify regions of treatment-resistant tumour and deliver higher doses to these targets.


Another technology to keep a close eye on is the introduction of upright radiotherapy, pioneered by Leo Cancer Care. Back in January we reported on a patient positioning system that allows cancer patients to receive radiotherapy whilst sitting upright – in contrast to having to lie on their back – a position that should reduce organ movement during treatment and may also be more comfortable for the patient. Since then, several studies have confirmed the benefits of this upright approach for various tumour types, orders have been placed, and the positioning system is now pending 510(k) regulatory clearance for clinical use in the USA.


Wearable wonders


Each year we see the emergence of ingenious wearable devices for countless new healthcare monitoring and diagnostic applications; and 2023 was no exception.


For starters, a team headed up at The University of Texas at Austin created an ultrathin, stretchable electronic tattoo that provides continuous cardiac monitoring. Attached to the chest via a medical dressing, the e-tattoo could detect early signs of heart disease outside of the clinic. Meanwhile, a wearable ultrasound transducer developed at the University of California San Diego could be used to monitor patients with serious cardiovascular conditions, as well as to help athletes keep track of their training.


Other novel wearables reported this year include a ring device that accurately gauges how intensely its wearer is scratching their skin, a miniaturized ultrasound scanner that may provide earlier detection of breast cancer, and earbud biosensors that continuously and simultaneously measure the electrical activity of the brain and levels of lactate in sweat.


Finally, researchers from the Medical University of Vienna designed a prototype MRI coil that can be worn like a sports bra. The so-called BraCoil is a vest-like receive-only coil array made of flexible coil elements that enables 3 T MR imaging of patients in both supine (lying on their back) and prone (lying on their front) positions. Designed to improve comfort, and reduce preparation and acquisition time, the BraCoil also produced an up to three-fold improvement in signal-to-noise ratio compared with standard coils.


The post Medical physics and biotechnology: highlights of 2023 appeared first on Physics World.


]]>
https://hadamard.com/c/medical-physics-and-biotechnology-highlights-of-2023/feed/ 0 38
QUANT-NET’s testbed innovations: reimagining the quantum network https://hadamard.com/c/quant-nets-testbed-innovations-reimagining-the-quantum-network/ https://hadamard.com/c/quant-nets-testbed-innovations-reimagining-the-quantum-network/#respond Sat, 23 Dec 2023 10:00:14 +0000 https://physicsworld.com/?p=111903 Continue reading QUANT-NET’s testbed innovations: reimagining the quantum network]]> Today’s internet distributes classical bits and bytes of information over global, even interstellar, distances. The quantum internet of tomorrow, on the other hand, will enable the remote connection, manipulation and storage of quantum information – through distribution of quantum entanglement using photons – across physically distant quantum nodes within metropolitan, regional and long-haul optical networks. The opportunities are compelling and already coming into view for science, national security and the wider economy.


By exploiting the principles of quantum mechanics – superposition, entanglement and the “no-cloning” theorem, for example – quantum networks will enable all sorts of unique applications that are not possible with classical networking technologies. Think quantum-encrypted communication schemes for government, finance, healthcare and the military; ultrahigh-resolution quantum sensing and metrology for scientific research and medicine; and, ultimately, the implementation of at-scale, cloud-based quantum computing resources linked securely across global networks.


Right now, though, quantum networks are still in their infancy, with the research community, big tech (companies like IBM, Amazon, Google and Microsoft) and a wave of venture-financed start-ups all pursuing diverse R&D pathways towards practical functionality and implementation. A case study in this regard is QUANT-NET, a $12.5m, five-year R&D initiative that’s backed by the US Department of Energy (DOE), under the Advanced Scientific Computing Research programme, with the goal of constructing a proof-of-principle quantum network tested for distributed quantum computing applications.


Out of the lab, into the network


Collectively, the four research partners within the QUANT-NET consortium – Berkeley Lab (Berkeley, CA); University of California Berkeley (UC Berkeley, CA); Caltech (Pasadena, CA); and the University of Innsbruck (Austria) – are seeking to establish a three-node, distributed quantum computing network between two sites (Berkeley Lab and UC Berkeley). In this way, each of the quantum nodes will be linked up via a quantum entanglement communication scheme over pre-installed telecoms fibre, with all the testbed infrastructure managed by a custom-built software stack.


“There are many complex challenges when it comes to scaling up the number of qubits on a single quantum computer,” says Indermohan (Inder) Monga, QUANT-NET principal investigator and director of the scientific networking division at Berkeley Lab and executive director of Energy Sciences Network (ESnet), the DOE’s high-performance network user facility (see “ESnet: networking large-scale science”). “But if a larger computer can be built from a network of multiple smaller computers,” he adds, “could we perhaps fast-track the scaling of quantum computing capability – more qubits working in tandem essentially – by distributing quantum entanglement over a fibre-optic infrastructure? That’s the fundamental question we’re trying to answer within QUANT-NET.”


ESnet: networking large-scale science across the US and beyond


ESnet provides high-bandwidth network connections and services to multidisciplinary scientists across more than 50 research sites of the US Department of Energy (DOE) – including the entire National Laboratory system, its associated supercomputing resources and large-scale facilities – as well as peering with more than 270 research and commercial networks worldwide.


Funded by the DOE Office of Science, ESnet is a designated DOE User Facility managed and operated by the scientific networking division at Berkeley Lab. “We think of ESnet as the data circulatory system for the DOE,” says Inder Monga, ESnet executive director and head of the QUANT-NET project. “Our teams work closely with both DOE researchers and the international networking community as well as industry to develop open-source software and collaborative technical projects that will accelerate large-scale science.”


The positioning of QUANT-NET within Monga’s remit is no accident, tapping into the accumulated domain knowledge and expertise of the ESnet engineering teams on network architectures, systems and software. “The QUANT-NET goal is a 24/7 quantum network exchanging entanglement and mediated by an automated control plane,” notes Monga. “We are not going to get there in the scope of this limited R&D project, but this is where we’re heading from a vision perspective.”


Another motivation for Monga and colleagues is to take quantum communication technologies “out of the lab” into real-world networking systems that exploit telecoms fibres already deployed in the ground. “Current quantum networking systems are still essentially room-sized or table-top physics experiments, fine-tuned and managed by graduate students,” says Monga.


As such, one of the main tasks for the QUANT-NET team is to demonstrate field-deployable technologies that, over time, will be able to operate 24/7 without operator intervention. “What we want to do is build the software stack to orchestrate and manage all the physical-layer technologies,” Monga adds. “Or at least get some idea of what that software stack should look like in future so as to automate high-rate and high-fidelity entanglement generation, distribution and storage in an efficient, reliable, scalable and cost-effective way.”


Enabling quantum technologies

If the QUANT-NET end-game is to road-test the candidate hardware and software technologies for the quantum internet, it’s instructive from a physics perspective to unpack the core quantum building blocks that make up the testbed’s network nodes – namely, trapped-ion quantum computing processors; quantum frequency-conversion systems; and colour-centre-based, single-photon silicon sources.


With respect to the networking infrastructure, there’s already been significant progress on testbed design and implementation. The QUANT-NET testbed infrastructure is complete, including fibre construction (5 km in extent) between the quantum nodes plus the fitting out of a dedicated quantum networking hub at Berkeley Lab. Initial designs for the quantum network architecture and software stack are also in place.

Pictured: an ion trap housed within its vacuum system, with a close-up of a trap mounted to a printed circuit board.


The engine-room of the QUANT-NET project is the trapped-ion quantum computing processor, which relies on the integration of a high-finesse optical cavity with a novel chip-based trap for Ca+ ion qubits. These trapped-ion qubits will connect via a dedicated quantum channel across the network testbed – in turn, creating long-distance entanglement between distributed quantum computing nodes.

“Demonstrating entanglement is key as it provides a link between the remote quantum registers that can be used to teleport quantum information between different processors or to execute conditional logic between them,” says Hartmut Häffner, who is a principal investigator on the QUANT-NET project with Monga, and whose physics lab on the UC Berkeley campus is the other node in the testbed. Equally important, the computing power of a distributed quantum computer scales significantly with the number of qubits that can be interconnected therein.


To entangle two remote ion traps across the network is far from straightforward, however. First, the spin of each ion must be entangled with the polarization of an emitted photon from its respective trap (see “Engineering and exploiting entanglement in the QUANT-NET testbed”). The high-rate, high-fidelity ion–photon entanglement in each case relies on single, near-infrared photons emitted at a wavelength of 854 nm. These photons are converted to the 1550 nm telecoms C-band to minimize fibre-optic losses impacting subsequent photon transmission between the UC Berkeley and Berkeley Lab quantum nodes. Taken together, trapped ions and photons represent a win–win, with the former providing the stationary computing qubits; the latter serving as “flying communication qubits” to link up the distributed quantum nodes.


At a more granular level, the quantum frequency-conversion module exploits established integrated photonic technologies and the so-called “difference frequency process”. In this way, an input 854 nm photon (emitted from a Ca+ ion) is mixed coherently with a strong pump field at 1900 nm in a nonlinear medium, yielding an output telecoms photon at 1550 nm. “Crucially, this technique preserves the quantum states of the input photons while providing high conversion efficiencies and low-noise operation for our planned experiments,” says Häffner.
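The arithmetic behind that conversion is energy conservation: the output frequency is the difference between the input and pump frequencies, so 1/lambda_out = 1/lambda_in - 1/lambda_pump. A quick check with the wavelengths quoted above (an illustrative calculation only):

lambda_in, lambda_pump = 854e-9, 1900e-9          # metres
lambda_out = 1 / (1 / lambda_in - 1 / lambda_pump)
print(round(lambda_out * 1e9))                    # about 1551 nm, in the telecoms C-band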


With entanglement established between two nodes, the QUANT-NET team can then demonstrate the fundamental building block of distributed quantum computing, in which the quantum information in one node controls the logic in the other. In particular, entanglement and classical communication are used to teleport quantum information from the controlling node into the target node, where the process – such as a non-local, controlled NOT quantum logic gate – can then be executed with local operations only.


Engineering and exploiting quantum entanglement in the QUANT-NET testbed


The establishment of ion–ion entanglement between two trapped-ion quantum nodes relies on synchronous preparation of ion–photon entanglement (in the spin and polarization degrees of freedom) within each network node (1). The cycle starts with ion-state initialization, after which a laser pulse triggers emission of a near-infrared photon in the optical cavity of each ion trap. After quantum frequency conversion (2), the resulting telecoms photons (entangled with the respective ions) are sent towards a so-called Bell State Measurement (BSM) node in a bid to create ion–ion entanglement via measurements on the polarization states of the two photons (3). The process repeats (4) until both of the photons are transmitted successfully through their respective fibre and registered jointly at the BSM node, heralding the creation of ion–ion entanglement (5). This entanglement is stored until the quantum network requests to use it as a resource – for instance, to transmit quantum information via teleportation.

Finally, a parallel work package is under way to explore the impact of “heterogeneity” within the quantum network – acknowledging that multiple quantum technologies are likely to be deployed (and therefore interfaced with each other) in the formative stages of the quantum internet. In this regard, solid-state devices relying on silicon colour-centres (lattice defects that generate optical emission at telecoms wavelengths around 1300 nm) benefit from the inherent scalability of silicon nanofabrication techniques, while emitting single photons with a high level of indistinguishability (coherence) required for quantum entanglement.

“As a first step in this direction,” adds Häffner, “we plan to demonstrate quantum-state teleportation from a single photon emitted from a silicon colour-centre to a Ca+ qubit by alleviating the issue of spectral mismatch between these two quantum systems.”


The QUANT-NET roadmap


As QUANT-NET approaches its mid-way point, the goal for Monga, Häffner and colleagues is to characterize the performance of discrete testbed components independently, prior to integration and tuning of these elements into an operational research testbed. “With network system principles in mind, our focus will also be on automating the various elements of a quantum network testbed that typically might be manually tuned or calibrated in a lab environment,” says Monga.


Aligning QUANT-NET R&D priorities with other quantum networking initiatives around the world is also crucial – though differing, and perhaps incompatible, approaches will probably be the norm given the exploratory nature of this collective research endeavour. “We need many flowers to bloom for now,” Monga notes, “so that we can home in on the most promising quantum communication technologies and the associated network control software and architectures.”


Longer term, Monga wants to secure additional DOE funding, such that the QUANT-NET testbed can scale in terms of reach and complexity. “We hope that our testbed approach will enable easier integration of promising quantum technologies from other research teams and industry,” he concludes. “This in turn will provide for a rapid prototype–test–integrate cycle to support innovation…and will contribute to an accelerated understanding of how to build a scalable quantum internet that co-exists with the classical internet.”

 

Further reading

 

Inder Monga et al. 2023 QUANT-NET: A testbed for quantum networking research over deployed fiber. QuNet ’23, pp 31–37 (September 10–14 2023; New York, NY, US)

 

The post QUANT-NET’s testbed innovations: reimagining the quantum network appeared first on Physics World.

 

]]>
https://hadamard.com/c/quant-nets-testbed-innovations-reimagining-the-quantum-network/feed/ 0 39
In case you missed it: the 10 most popular physics stories of 2023 https://hadamard.com/c/in-case-you-missed-it-the-10-most-popular-physics-stories-of-2023/ https://hadamard.com/c/in-case-you-missed-it-the-10-most-popular-physics-stories-of-2023/#respond Fri, 22 Dec 2023 10:00:26 +0000 https://physicsworld.com/?p=111859 Continue reading In case you missed it: the 10 most popular physics stories of 2023]]> Physics isn’t a popularity contest, but the 10 most read articles published on the Physics World website in 2023 nevertheless make an interesting collection of highlights, lowlights and every kind of light in between. If you didn’t spot these stories when they first appeared, here’s your chance to find out what the fuss was about.

 

10. Early galaxies transformed the universe

 

We said in 2022 that the best was yet to come for the James Webb Space Telescope (JWST), and we weren’t wrong. In 2023, NASA/ESA’s shiny new infrared eye in the heavens spotted an ionized molecule that could be involved in the emergence of life; found pairs of rogue planets wandering through the Orion nebula; and refined our knowledge of redshifts in distant galaxies. Heck, it may even have seen “dark stars” – hypothetical objects powered by the annihilation of dark matter rather than boring old fusion reactions. But the JWST story that got the most attention from Physics World readers was science writer Rob Lea’s account of how early galaxies reionized the early universe. This reionization was one of the most important events in the history of the universe, because it allowed light that would otherwise have been absorbed by hydrogen to travel stupendous distances through time and space to all manner of objects – including the telescope that discovered it. We think this is pretty exciting, which is why it’s also in our list of the year’s top 10 breakthroughs.

 

9. An unexpected link between classical and quantum mechanics

 

One of the great things about physics is that results from hundreds of years ago are often as valid – and as useful – today as they were when physicists first came up with them. Take Christiaan Huygens, whose biggest discoveries came in the two most popular fields of 17th-century physics: optics and mechanics. Three and a half centuries later, Xiao-Feng Qian and Misagh Izadi of the Stevens Institute of Technology found an unexpected connection between these areas of Huygens’ work. By analysing two optical coherence properties, Qian and Izadi showed that they are quantitatively related to centre of mass and moment of inertia through the Huygens-Steiner theorem for rigid body rotation. Neat, eh?
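
For readers who don't have it to hand, the Huygens-Steiner (parallel-axis) theorem invoked here is the familiar rigid-body identity, written in LaTeX as

I = I_{\mathrm{cm}} + M d^{2},

where I_cm is the moment of inertia about an axis through the centre of mass, M is the total mass and d is the distance between that axis and the parallel axis of interest. Qian and Izadi's contribution, as described above, is to show that measurable optical coherence quantities obey a relation of exactly this form; the precise dictionary between the optical and mechanical quantities is given in their paper.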

 

8 and 7. A record-breaking lithium battery and concentrated solar reactor

 

This year was almost certainly the hottest since records began, and the outcome of December’s COP28 summit on climate change won’t change the planet’s alarming trajectory any time soon. Still, Physics World readers took heart from two positive green technology developments in 2023: a lithium battery in China with the highest energy density on record, and a concentrated solar reactor in Switzerland that produces “green” hydrogen at a rate of more than 2 kilowatts and efficiencies above 20%. Both innovations are ably described in this pair of stories by Physics World corresponding editor Isabelle Dumé.

 

Prism scattering light

 

6. An iconic but impossible prism

 

The prism-and-rainbow cover of Pink Floyd’s 1973 album Dark Side of the Moon is an iconic piece of art, but as a piece of physics, the band might as well have set the controls for the heart of the Sun. In this light-hearted essay written to mark the album’s 50th anniversary, physics teacher Tom Tierney describes how he challenged his students to measure the prism’s angle of refraction and find a real material that corresponds to it. The closest possibility, it seems, is the mineral zincite, but ultimately this cover design sent scientific accuracy off to a great gig in the sky.

 

4. Predicting the Nobel Prize for Physics

 

As the physics Nobel laureate Niels Bohr was (allegedly) fond of saying, it’s hard to make predictions, especially about the future. But that hasn’t stopped the Physics World editorial team from trying to predict the winners of each year’s Nobel Prize for Physics, and in 2023 we got our predictions partly right, correctly including Anne L’Huillier and Ferenc Krausz (though not their co-laureate Pierre Agostini) in our list of possible physics-prize winners. Predicting the 2023 Nobel Prize for Chemistry, meanwhile, was easier than usual thanks to an unprecedented leak of the winners’ names several hours before the official announcement. We’re guessing that winners Moungi Bawendi, Louis Brus and Alexei Ekimov aren’t complaining.

 

3. The death of a 130-year-old technology firm

 

The usual trajectory of a hot technology start-up resembles that of a rock-n-roll star: they live fast, they die young, and depending on how well they’re managed, they either go out in a blaze of IPO glory or get crushed under a pile of federal criminal charges. For some businesses, though, failure takes much longer, and William D Cohan’s book Power Failure: the Rise and Fall of General Electric (reviewed here by Anita Chandran) offers an in-depth analysis of how it happened at one of the 20th century’s biggest technology firms.

 

5 and 2. A room-temperature superconductor? Not so fast

 

For a fortnight at the end of July and the beginning of August, the scientific world – or at least the part of it that spends too much time on social media – went gaga over claims that researchers in Korea had synthesized the first room-temperature superconductor. A material that conducts electricity without resistance under everyday conditions (as opposed to liquid nitrogen temperatures or millions of atmospheres of pressure) would be a major technological breakthrough, so it’s easy to see why the announcement attracted attention. Alas, a flurry of attempts to replicate the supposedly superconducting behaviour of the material called LK-99 came to nothing, and materials scientists had to find something else to do with the rest of their summer holidays.

 

Which brings us to the most popular article published on the Physics World site this year:

 

Oppenheimer movie image

 

1. The Bomb goes to the box office

 

It isn’t every year that a physics-related film becomes a summer blockbuster, so it’s not surprising that our review of Christopher Nolan’s biopic Oppenheimer got plenty of attention from Physics World readers. The jury is out on whether a movie about the deep moral ambiguities of atomic weapons could ever be considered “good advertising” for physics as a discipline, but it certainly got people flocking to the cinema. The film was a massive box-office hit, thanks in part to the “Barbenheimer” phenomenon that saw it paired with the summer’s other big cinematic success story, Barbie, in a surreal double billing. Though the Physics World editorial team’s review of Oppenheimer lacked the star power of Cillian Murphy, Margot Robbie and Ryan Gosling, it nevertheless took the crown as the most-read article published on the Physics World website in 2023.

 

The post In case you missed it: the 10 most popular physics stories of 2023 appeared first on Physics World.

]]>
https://hadamard.com/c/in-case-you-missed-it-the-10-most-popular-physics-stories-of-2023/feed/ 0 40
Superconducting electrode controls spin waves in a magnet https://hadamard.com/c/superconducting-electrode-controls-spin-waves-in-a-magnet/ https://hadamard.com/c/superconducting-electrode-controls-spin-waves-in-a-magnet/#respond Thu, 21 Dec 2023 10:00:45 +0000 https://physicsworld.com/?p=111930 Continue reading Superconducting electrode controls spin waves in a magnet]]> Placing a superconducting electrode on top of a thin magnet makes it possible to manipulate and control so-called “spin waves” within the magnet simply by changing the electrode’s temperature. This result, from quantum physicists at Delft University of Technology in the Netherlands, could advance the development of spintronics devices, which exploit the spin of an electron as well as its charge.


Spin waves are collective oscillations of magnetic order in magnetic materials, and they show much promise for spintronics because they can travel millimetres or even centimetres in some media with very little loss. This means they could transmit electrical signals over long distances while using less energy than conventional electronics. They can also be manipulated to perform many calculations or operations before the signal from them fades out, which is important for practical devices.


The main problem with spin waves is that they are hard to control. However, researchers led by Toeno van der Sar and Michael Borst have now shown that it is possible to do this in a magnetic thin film using a superconductor. In their study, which they describe in Science, they started with a chip covered by a thin magnetic film of yttrium iron garnet (YIG). On top of this film they placed a gold electrode, which they used to excite spin waves in the YIG. They then placed a superconducting electrode next to the gold electrode and studied how the spin waves travelled underneath it.


Controlling where and how the spin waves propagate


While theory predicts that normal (non-superconducting) metal electrodes should be able to control the wavelength and propagation of spin waves, the group’s previous work revealed that such electrodes “primarily dampen out spin waves and don’t provide this control at all”, Borst explains. He and his colleagues were thus very interested to find out whether a superconductor would give a different result – which it did.


“For the electrode to become superconducting, we cooled the chip to below 9 K and when it became so, we suddenly observed a dramatic change in the spin wavelength,” Borst says. “We found that by changing the temperature of the electrode, we could accurately tune this wavelength. And by creating a temperature gradient in the electrode, we could control where and how the spin waves propagate.”


Monitoring propagation


One major challenge the team had to overcome was finding a way to monitor how spin waves propagate under the electrode. This is not an easy task, but the researchers addressed it by creating a unique magnetic field sensor based on electron spins in diamond that allows them to observe the spin waves directly. “This is a powerful technique that will surely come in useful for characterizing more complex metal-covered spin-wave devices in the future,” Borst tells Physics World.


According to the Delft University of Technology team, the new work could make it possible to create many kinds of spin-wave circuits and devices, such as on-chip spin-wave cavities, spin-wave reflectors and spin-wave gratings.


“Interestingly, we can also learn about important properties of the superconductor by studying these waves,” Borst says. “Indeed, we have demonstrated this by mapping one such fundamental parameter, the superconductor’s London penetration depth (the depth at which an external magnetic field penetrates into a superconductor), as a function of temperature.”


Looking forward, the researchers are now working out ways of developing real-world spin-wave devices and studying how the superconductor interacts with different types of spin waves. “We would also like to further our control over spin-wave propagation by introducing complex temperature gradients in the superconducting electrode,” Borst says.


The post Superconducting electrode controls spin waves in a magnet appeared first on Physics World.


]]>
https://hadamard.com/c/superconducting-electrode-controls-spin-waves-in-a-magnet/feed/ 0 41
What would happen if communication systems broke down? https://hadamard.com/c/what-would-happen-if-communication-systems-broke-down/ https://hadamard.com/c/what-would-happen-if-communication-systems-broke-down/#respond Wed, 20 Dec 2023 11:00:57 +0000 https://physicsworld.com/?p=111466 Continue reading What would happen if communication systems broke down?]]> “A sight never to be forgotten.”


“Heaven became illuminated.”


“Nothing could exceed the grandeur and the beauty.”


These are just some of the phrases used by eyewitnesses to describe the remarkable aurora that danced over much of the globe for three special nights in early September 1859. Visible at unprecedented low latitude locations, including Colombia, Hawaii and Queensland, the light show was the result of the most intense geomagnetic storm in recorded history. Dubbed “The Carrington Event”, the episode was triggered by the direct collision between Earth’s magnetosphere and a major coronal mass ejection from the Sun.


Astonishing phenomena were induced – both literally and figuratively – in the telegraph networks of Europe and North America, and the transatlantic cable that freshly connected them. Currents induced in the cables caused telegraph pylons to spark, some operators reported receiving electric shocks, and many connections failed completely. Other lines, meanwhile, were found to function even once the power to them had been cut.


While the fibre-optic cables that make up the backbone of today’s Internet are, given their composition, immune to the electromagnetic fluctuations of solar storms, the same cannot be said of signal boosters, which punctuate undersea cables to ensure connection can be supported over long distances. Moreover, a major space weather event today could also disrupt radio communications, interfere with satellite operations and take out power grids.


That’s not as unlikely as it sounds – in early 1989 a solar storm triggered by a coronal mass ejection famously plunged nine million people in Quebec, Canada, into a blackout that lasted around nine hours. Some astrophysicists have estimated that there is roughly a 2–12% chance that a solar storm hitting Earth in the next decade could cause catastrophic disruption to modern society.


From fact to fiction


The effect of such a modern-day Carrington Event is explored in Sigh No More, one of the engrossing tales in Communications Breakdown: SF Stories about the Future of Connection – a science-fiction anthology compiled by Hugo Award-winning publisher and editor Jonathan Strahan as part of MIT Press’s Twelve Tomorrows series. The book presents 10 short stories on the future of communication and the pitfalls of inequalities in it. It also includes an interview with surveillance and privacy researcher Chris Gilliard of the Shorenstein Center on Media, Politics and Public Policy.


Written by Ian McDonald, Sigh No More (whose title will be appreciated by theatre aficionados) takes an indirect look at the effect of a series of catastrophic space weather phenomena through the lens of a plucky community theatre production of Much Ado About Nothing. Undaunted by the modern apocalypse, these “idiots trying to put on Shakespeare” overcome enduring blackouts, paralysed transportation systems and muggers exploiting the return of a hard-cash economy, to go “full Bard” in Millwall Park in a neat little ending that relies on some aspects borrowed from the original Carrington Event.


Influencing the fabric of Sigh No More is the fact that Communications Breakdown was compiled in the wake of COVID-19. In fact, the pandemic is referenced multiple times and reverberates throughout the anthology – perhaps because that socially isolating period highlighted the importance of modern communication systems. Sigh No More envisages a reversal of this situation, imagining that “When the Sun blew a ten-billion-ton plasma kiss at Earth, there were no online quizzes, no Microsoft Teams meetings, no Zoom play-readings, no tweeting on shared Netflix experiences. The Event shut down human communications but opened a thousand doors to human contact.”


Battling corruption


Another story that is likely to pique the interest of the scientist reader is Premee Mohamed’s At Every Door A Ghost. In this tale, a pair of researchers turn to covert research in the wake of an AI-driven chemical weapons attack that sees the production of scientific knowledge both constrained and aggressively surveilled.


In fact, many of the stories pit their protagonists against overbearing, corrupt and uncaring systems. For example, in Company Man by Shiv Ramdas the enemy is a medical device firm whose bizarre, impersonal and crushing administrative ethos is straight out of a Franz Kafka novel; while in Moral Hazard by Cory Doctorow it’s a Supreme Court decision that sees all weather warnings placed behind paywalls. (The latter focuses on hacking and punk subculture, which, along with its setting, brought to mind Neal Stephenson’s iconic novel Snow Crash.) It also feels pointed that the two works feature ordinary people becoming corporations to acquire the power not permitted to them under the decidedly neoliberal status quo of their narratives.


As Strahan himself notes in his foreword, while such moments in the anthology’s stories can be seen as “dark or depressing”, they also “show the possibility of solutions, of things getting better, of improvement”. Or, as the cast of the Millwall Much Ado might have put it – if “all the world’s a stage, and all the men and women merely players”, then the show must go on!


  • 2023 MIT Press 224pp £21hb

The post What would happen if communication systems broke down? appeared first on Physics World.


]]>
https://hadamard.com/c/what-would-happen-if-communication-systems-broke-down/feed/ 0 42
The science and art of complex systems https://hadamard.com/c/the-science-and-art-of-complex-systems/ https://hadamard.com/c/the-science-and-art-of-complex-systems/#respond Wed, 20 Dec 2023 05:00:00 +0000 https://news.mit.edu/2023/gosha-geogdzhayev-climate-modeling-1220 Continue reading The science and art of complex systems]]> As a high school student, Gosha Geogdzhayev attended Saturday science classes at Columbia University, including one called The Physics of Climate Change. “They showed us a satellite image of the Earth’s atmosphere, and I thought, ‘Wow, this is so beautiful,’” he recalls. Since then, climate science has been one of his driving interests.

With the MIT Department of Earth, Atmospheric and Planetary Sciences and the BC3 Climate Grand Challenges project, Geogdzhayev is creating climate model “emulators” in order to localize the large-scale data provided by global climate models (GCMs). As he explains, GCMs can make broad predictions about climate change, but they are not proficient at analyzing impacts in localized areas. However, simpler “emulator” models can learn from GCMs and other data sources to answer specialized questions. The model Geogdzhayev is currently working on will project the frequency of extreme heat events in Nigeria.

A senior majoring in physics, Geogdzhayev hopes that his current and future research will help reshape the scientific approach to studying climate trends. More accurate predictions of climate conditions could have benefits far beyond scientific analysis, and affect the decisions of policymakers, businesspeople, and truly anyone concerned about climate change.

“I have this fascination with complex systems, and reducing that complexity and picking it apart,” Geogdzhayev says.

His pursuit of discovery has led him from Berlin, Germany, to Princeton, New Jersey, with stops in between. He has worked with Transsolar KlimaEngineering, NASA, NOAA, FU Berlin, and MIT, including through the MIT Climate Stability Consortium’s Climate Scholars Program, in research positions that explore climate science in different ways. His projects have involved applications such as severe weather alerts, predictions of late seasonal freezes, and eco-friendly building design.

The written word

Originating even earlier than his passion for climate science is Geogdzhayev’s love of writing. He recently discovered original poetry dating back all the way to middle school. In this poetry he found a coincidental throughline to his current life: “There was one poem about climate, actually. It was so bad,” he says, laughing. “But it was cool to see.”

As a scientist, Geogdzhayev finds that poetry helps quiet his often busy mind. Writing provides a vehicle to understand himself, and therefore to communicate more effectively with others, which he sees as necessary for success in his field.

“A lot of good work comes from being able to communicate with other people. And poetry is a way for me to flex those muscles. If I can communicate with myself, and if I can communicate myself to others, that is transferable to science,” he says.

Since last spring Geogdzhayev has attended poetry workshop classes at Harvard University, which he enjoys partly because it nudges him to explore spaces outside of MIT.

He has contributed prolifically to platforms on campus as well. Since his first year, he has written as a staff blogger for MIT Admissions, creating posts about his life at MIT for prospective students. He has also written for the yearly fashion publication “Infinite Magazine.”

Merging both science and writing, a peer-reviewed publication by Geogdzhayev will soon be published in the journal “Physica D: Nonlinear Phenomena.” The piece explores the validity of climate statistics under climate change through an abstract mathematical system.

Leading with heart

Geogdzhayev enjoys being a collaborator, but also excels in leadership positions. When he first arrived at MIT, his dorm, Burton Conner, was closed for renovation, and he could not access that living community directly. Once his sophomore year arrived however, he was quick to volunteer to streamline the process to get new students involved, and eventually became floor chair for his living community, Burton 1.

Following the social stagnation caused by the Covid-19 pandemic and the dorm renovation, he helped rebuild a sense of community for his dorm by planning social events and governmental organization for the floor. He now regards the members of Burton 1 as his closest friends and partners in “general tomfoolery.”

This sense of leadership is coupled with an affinity for teaching. Geogdzhayev is a peer mentor in the Physics Mentorship Program and taught climate modeling classes to local high school students as a part of SPLASH. He describes these experiences as “very fun” and can imagine himself as a university professor dedicated to both teaching and research.

Following graduation, Geogdzhayev intends to pursue a PhD in climate science or applied math. “I can see myself working on research for the rest of my life,” he says.

]]>
https://hadamard.com/c/the-science-and-art-of-complex-systems/feed/ 0 202
Radiant chills: the revolutionary science of laser cooling https://hadamard.com/c/radiant-chills-the-revolutionary-science-of-laser-cooling/ https://hadamard.com/c/radiant-chills-the-revolutionary-science-of-laser-cooling/#respond Tue, 19 Dec 2023 11:00:37 +0000 https://physicsworld.com/?p=111892 Continue reading Radiant chills: the revolutionary science of laser cooling]]>

Over the past half century, laser cooling has revolutionized atomic, molecular and optical physics. Laser cooling of atoms and ions has enabled dramatic leaps in the precision of atomic clocks, allowing new tests of fundamental physics and potential improvements in clock-based navigation via the Global Positioning System. Now it is also laying the foundations for quantum computing with atoms and ions.


In this episode of Physics World Stories, you can enjoy a vibrant tour through the history of laser cooling with Chad Orzel, a popular-science author and researcher at Union College in the US, who is in conversation with Andrew Glester. Orzel describes the key research breakthroughs – which have led to several Nobel prizes – but also the personal stories behind the discoveries, involving physics titans such as Hal Metcalf, Bill Phillips and Steven Chu.


You can learn more about this topic via a trilogy of features that Chad Orzel has written for Physics World. The final instalment will be available in January and you can already read the first two articles:


The post Radiant chills: the revolutionary science of laser cooling appeared first on Physics World.


]]>
https://hadamard.com/c/radiant-chills-the-revolutionary-science-of-laser-cooling/feed/ 0 47
Quantum simulator visualizes large-scale entanglement in materials https://hadamard.com/c/quantum-simulator-visualizes-large-scale-entanglement-in-materials/ https://hadamard.com/c/quantum-simulator-visualizes-large-scale-entanglement-in-materials/#respond Thu, 14 Dec 2023 10:07:18 +0000 https://physicsworld.com/?p=111833 Continue reading Quantum simulator visualizes large-scale entanglement in materials]]> Artist's illustration showing a magnifying glass suspended over a grey surface of a material. Brightly coloured particles - red, blue, purple and orange, representing different temperatures - are popping out of the material and passing through the magnifying glass


Physicists in Austria have found a quick and efficient way of extracting information on a quantum material’s large-scale entanglement structure thanks to a 50-year-old theorem from quantum field theory. The new method could open doors in fields such as quantum information, quantum chemistry or even high-energy physics.


Quantum entanglement is a phenomenon whereby the information contained in an ensemble of particles is encoded in correlations among them. This information cannot be accessed by probing the particles individually, and it is an essential feature of quantum mechanics, one that clearly distinguishes the quantum from the classical world. As well as being pivotal for quantum computing and quantum communication, entanglement heavily influences the properties of an emerging class of exotic materials. A deeper understanding of it could therefore help scientists understand and solve problems in materials science, condensed-matter physics and beyond.


The problem is that learning about the internal entanglement of a large number of entangled particles is notoriously hard, since the complexity of the correlations increases exponentially with the number of particles. This complexity makes it impossible for a classical computer to simulate materials made from such particles. Quantum simulators are better equipped for this task, as they can represent the same exponential complexity as the target material they are simulating. However, extracting the entanglement properties of a material with standard techniques still requires an intractably large number of measurements.
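
A quick back-of-the-envelope calculation shows why brute force fails. Storing the full state vector of n entangled qubits or ions takes 2^n complex amplitudes; the short Python check below, using the 51-ion register described in the next paragraph and assuming 16 bytes per double-precision complex number, is only a rough illustration of the memory cost.

n = 51                            # ions in the register described below
amplitudes = 2 ** n               # complex amplitudes in the full state vector
memory_bytes = amplitudes * 16    # 16 bytes per double-precision complex number
print(f"{amplitudes:.2e} amplitudes ~ {memory_bytes / 1e15:.0f} petabytes")
# -> 2.25e+15 amplitudes ~ 36 petabytes, before a single gate is applied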


Quantum simulator


In their new, more efficient method for evaluating the strength of a system’s entanglement, researchers from the University of Innsbruck and the nearby Institute of Quantum Optics and Quantum Information (IQOQI) interpreted entanglement strength in terms of a local temperature. While highly entangled regions of the quantum material appear “hot” in this method, weakly entangled regions appear “cold”. Crucially, the exact form of this locally varying temperature field is predicted by quantum field theory, enabling the team to measure temperature profiles more efficiently than was possible with previous methods.


To simulate an entangled quantum material, the Innsbruck-IQOQI team used a system of 51 40Ca+ ions held in place inside a vacuum chamber by the oscillating electric field of a device called a linear Paul trap. This setup allows each ion to be individually controlled and its quantum state read out with high accuracy. The researchers could quickly determine the right temperature profiles by placing a feedback loop between the system and a (classical) computer that constantly generates new profiles and compares them with the actual measurements in the experiment. They then made measurements to extract properties such as the system’s energy. Finally, they investigated the internal structure of the system’s states by studying the “temperature” profiles, which enabled them to determine the entanglement.


Hot and cold regions


The temperature profiles the team obtained show that regions that are strongly correlated with surrounding particles can be considered “hot” (that is, highly entangled) and those that interact very little can be considered “cold” (weakly entangled). The researchers also confirmed, for the first time, predictions of quantum field theory as adapted to ground states (or low temperature states) of materials via the Bisognano-Wichmann theorem, which was first put forward in 1975 as a way of relating certain Lorentz transformations in spacetime to transformations in charge, parity and time. In addition, the method enabled them to visualize the crossover from weakly entangled ground states to strongly entangled excited states of the quantum material.
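
The Bisognano-Wichmann prediction behind that "temperature" picture can be stated compactly. Under the textbook assumptions of the theorem (the vacuum of a Lorentz-invariant field theory restricted to the half-space x > 0, with hbar = c = k_B = 1), the entanglement (modular) Hamiltonian takes the form

H_{\mathrm{E}} = \int_{x>0} \beta(x)\, h(\mathbf{x})\, \mathrm{d}^{d}x, \qquad \beta(x) = 2\pi x,

where h(x) is the local energy density. The linearly growing inverse temperature β(x) means the state looks hottest (most entangled) right at the entangling cut and colder further away, which is the profile the Innsbruck-IQOQI team adapts to lattice ground states and then fits in the experiment.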


Team leader Peter Zoller, who holds positions at both Innsbruck and the IQOQI, says that the results and the techniques – quantum protocols running on a quantum simulator – used to obtain them are generally applicable to the simulation of quantum materials. For this reason, he believes they hold broad importance for quantum information science and technology as well as quantum simulation. “For future experiments we [would] like to do this with other platforms and more complicated/interesting model systems,” he tells Physics World. “Our tools and techniques are very general.”


Marcello Dalmonte, a physicist at the Abdus Salam International Centre for Theoretical Physics in Italy who was not involved in the research, calls the results “a true ground-breaker”. In his view, the method brings our experimentally testable understanding of entanglement to a new level by unveiling its full complexity. He also thinks the technique will improve our understanding of the relationship between entanglement and physical phenomena, and is excited by the possibility of using it to solve key questions in theoretical physics, such as reaching a better understanding of the operator entanglement structure for mixed states. Another possible area to explore might be the mutual entanglement between chunks of matter, though Dalmonte adds that this would require further improvements to the protocol, including boosting its scalability.


The research is described in Nature.


The post Quantum simulator visualizes large-scale entanglement in materials appeared first on Physics World.


]]>
https://hadamard.com/c/quantum-simulator-visualizes-large-scale-entanglement-in-materials/feed/ 0 60
Portable optical atomic clock makes its commercial debut https://hadamard.com/c/portable-optical-atomic-clock-makes-its-commercial-debut/ https://hadamard.com/c/portable-optical-atomic-clock-makes-its-commercial-debut/#respond Tue, 12 Dec 2023 15:00:39 +0000 https://physicsworld.com/?p=111785 Continue reading Portable optical atomic clock makes its commercial debut]]> Atoms are the world’s most precise timekeepers – so much so that the second is defined as exactly 9 192 631 770 ticks of a caesium-based atomic clock. Commercially-available versions of these atomically precise clocks underpin GPS, navigation, data transfer and financial markets, and they run at microwave frequencies, or billions of tick-tocks per second. After a day, their timekeeping is out by less than ten nanoseconds.


As good as this is, though, the next generation of atomic clocks is even more precise. These lab-based constructions run at optical frequencies, meaning they tick tens of trillions of times per second. The best of them can remain precise to 10 femtoseconds (10^-15 s) after a day, or within a second after 50 billion years. And soon, for the first time, you’ll be able to buy one of your very own: Vector Atomic, a start-up based in California, US, has put the first portable optical clock on the market.


“Today the only clocks you can buy are microwave clocks,” says Jonathan Hoffman, a programme manager at the US Defense Advanced Research Projects Agency (DARPA), which funded the work. “If you go to the optical transition, there’s a giant gain in precision, accuracy and performance, but it also typically comes with incredible complexity at the same time. Finding a happy compromise is the real battle.”


Finding the right atoms


The main difference between optical clocks and their microwave predecessors is lasers. To build the most precise clocks possible, scientists use the atoms that offer the narrowest atomic transitions – usually strontium or ytterbium – and design their laser systems around those atoms’ particular requirements. The atoms are kept in vacuum chambers, and different lasers are used to cool and trap them, while other lasers block undesirable transitions or interrogate the desired one used in the clock. All these lasers, up to a dozen total, need to be stabilized to precise frequencies, and maintaining them requires constant supervision.


To build a less precise, but more robust and portable, version of an optical clock, Vector Atomic CEO and co-founder Jamil Abo-Shaeer had to take a different approach. “Instead of designing the system around the atom, we designed the system around the lasers,” he says.


Photo of a frequency comb in a rectangular box


The toughest, most time-tested lasers in existence, Abo-Shaeer explains, are those used in telecommunications and industrial machining. Thanks to years (or even decades) of commercial R&D, they are extremely compact and stable, and he and his team chose an atomic species that suits them: molecular iodine. This molecule has convenient transitions near a frequency-doubled infrared laser commonly used in machining. The team also opted for a simple vapour-cell setup that avoids cooling the atoms to frigid temperatures or confining them in an ultrahigh vacuum.


The result was a turnkey optical clock, which the team call Evergreen, with a volume of just 30 litres – roughly the size of a record player. Although the precision of Evergreen’s timing is far from the lab-based state of the art, it is 100 times more precise than existing microwave clocks of a comparable size. It also matches the performance of clocks based on hydrogen masers – devices the size of walk-in fridges that are extremely sensitive to environmental noise.


Sea trials


In the summer of 2022, a prototype of Evergreen spent three weeks aboard a ship at sea for testing. During this time, the clock worked without any intervention. Upon return, the team tested the clock’s performance and found it had not significantly degraded, despite turbulence and temperature swings aboard ship. “When it happened, I thought everyone should be standing up and shouting from the rooftops,” Hoffman says. “I mean, people have been working on these optical clocks for decades. And this was the first time an optical clock ran on its own without human interference, out in the real world.”


Photo of Vector Atomic's optical clock, an oblong grey box with a display screen and a handful of connectors


According to Abo-Shaeer, Evergreen’s size and stability pave the way for widespread adoption of such clocks in navigation, especially when GPS signals are blocked or spoofed; in data centres and telecommunications protocols; and for synchronizing signals from remote detectors for scientific purposes. Currently, GPS is precise to about three metres, but more precise timing on satellites could bring that down to a few centimetres or less, allowing autonomous vehicles to stay in their lanes or delivery drones to land on a balcony. Being able to chop time up into smaller pieces should also allow for higher bandwidth communications, Abo-Shaeer adds.
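
A rough rule of thumb connects the timing and positioning numbers in that paragraph: a ranging error is roughly the clock error multiplied by the speed of light (real GPS error budgets also include orbit, atmospheric and multipath terms, so this is only an order-of-magnitude sketch).

c = 299_792_458                     # speed of light in m/s
for dt in (10e-9, 1e-9, 30e-12):    # clock errors: 10 ns, 1 ns, 30 ps
    print(f"{dt * 1e12:8.0f} ps timing error -> {c * dt:7.3f} m ranging error")
# 10 ns corresponds to about 3 m, so centimetre-level positioning needs
# satellite timing good to a few tens of picoseconds.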


Whether this particular clock is the one that will power the next generation of GPS and faster data transfer remains to be seen. But the technological advance is significant nonetheless, says Elizabeth Donley, the head of the US National Institute of Standards and Technology (NIST) time and frequency division in Boulder, Colorado. “There’s potentially a lot of other types of optical clocks that could come on the market over the next decade,” says Donley, who was not involved in Vector Atomic’s work. “The heart of this thing is an iodine vapour cell, but the infrastructure can be used for other types of clocks as well.”


The post Portable optical atomic clock makes its commercial debut appeared first on Physics World.


]]>
https://hadamard.com/c/portable-optical-atomic-clock-makes-its-commercial-debut/feed/ 0 65
Researchers grapple with bringing quantum security to the cloud https://hadamard.com/c/researchers-grapple-with-bringing-quantum-security-to-the-cloud/ https://hadamard.com/c/researchers-grapple-with-bringing-quantum-security-to-the-cloud/#respond Tue, 12 Dec 2023 12:00:14 +0000 https://physicsworld.com/?p=111771 Continue reading Researchers grapple with bringing quantum security to the cloud]]> A new protocol for cloud-computing-based information storage that could combine quantum-level security with better data-storage efficiency has been proposed and demonstrated by researchers in China. The researchers claim the work, which combines existing techniques known as quantum key distribution (QKD) and Shamir’s secret sharing, could protect sensitive data such as patients’ genetic information in the cloud. Some independent experts, however, are sceptical that it constitutes a genuine advance in information security.


The main idea behind QKD is to encrypt data using quantum states that cannot be measured without destroying them, and then send the data through existing fibre-optic networks within and between major metropolitan areas. In principle, such schemes make information transmission absolutely secure, but on their own, they only allow for user-to-user communication, not data storage on remote servers.


Shamir’s secret sharing, meanwhile, is an algorithm developed by the Israeli scientist Adi Shamir in 1979 that can encrypt information with near-perfect security. In the algorithm, an encrypted secret is dispersed between multiple parties. As long as a specific fraction of these parties remain uncompromised, each party can reconstruct absolutely nothing about the secret.
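
Shamir's 1979 scheme is compact enough to sketch in full. The toy Python implementation below is for illustration only (it is not the code used in the paper and is not hardened for production use): the secret becomes the constant term of a random polynomial over a prime field, each party gets one point on the polynomial, and any k points determine the polynomial, while k - 1 points leave every candidate secret equally likely.

import random

P = 2 ** 127 - 1                     # a Mersenne prime defining the field

def make_shares(secret, k, n, rng=random.SystemRandom()):
    """Split `secret` (an integer < P) into n shares; any k of them recover it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, k=3, n=5)
print(reconstruct(shares[:3]))       # any 3 of the 5 shares -> 123456789
print(reconstruct(shares[2:]))       # a different 3 shares  -> 123456789
# Reconstructing from only 2 shares returns an unrelated number: below the
# threshold, the shares carry no information about the secret.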


Secure and efficient cloud storage


Dong-Dong Li and colleagues at the University of Science and Technology of China (USTC) in Hefei and the spinout company QuantumCTek have combined these two technologies into a protocol that utilizes Shamir’s secret sharing to encrypt data stored in the cloud and resists outside intruders. Before uploading data to the central server, an operator uses a quantum random number generator to generate two bitstreams called K and R. The operator uses K to encrypt the data and then deletes it. R serves as an “authentication” key: after encrypting the data, the user inserts a proportion of bitstream R into the ciphertext and uploads it to a central server, retaining the remainder locally. The proportion the user uploads must be below the Shamir threshold.


In the next step, the central server performs what’s known as erasure coding on the ciphertext. This divides the data into packets sent on to remote servers. To ensure against loss of information, the system needs a certain amount of redundancy. The current standard cloud storage technique, storage mirroring, achieves this by storing complete copies of the data on multiple servers. In Li and colleagues’ chosen technique, the redundant data blocks are instead scattered between servers. This has two advantages over storage mirroring. First, it reduces storage costs, since less redundancy is required; secondly, compromising one server does not lead to a complete data leak, even if the encryption algorithm is compromised. “Erasure coding is characterized by high fault tolerance, scalability and efficiency. It achieves highly reliable data recovery with smaller redundant blocks,” the researchers tell Physics World.
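
The storage-overhead argument is easiest to see in a toy example. The snippet below is a deliberately simple, RAID-style single-parity code (illustrative only; a real system would use a stronger erasure code such as Reed-Solomon): k data blocks plus one XOR parity block survive the loss of any single block at a storage cost of (k + 1)/k, compared with 2x or 3x for full mirroring, and no single server ever holds the whole ciphertext.

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data, k=4):
    """Split `data` into k equal blocks plus one XOR parity block."""
    if len(data) % k:
        data += b"\x00" * (k - len(data) % k)      # pad to a multiple of k
    size = len(data) // k
    blocks = [data[i * size:(i + 1) * size] for i in range(k)]
    parity = blocks[0]
    for blk in blocks[1:]:
        parity = xor_bytes(parity, blk)
    return blocks + [parity]                        # k + 1 blocks, one per server

def decode(blocks):
    """Rebuild the data even if exactly one block is None (a lost server)."""
    if None in blocks:
        i = blocks.index(None)
        survivors = [b for b in blocks if b is not None]
        rebuilt = survivors[0]
        for b in survivors[1:]:
            rebuilt = xor_bytes(rebuilt, b)
        blocks = blocks[:i] + [rebuilt] + blocks[i + 1:]
    return b"".join(blocks[:-1])                    # drop the parity block

ciphertext = b"encrypted genomic records, already unreadable on their own"
stored = encode(ciphertext)      # five blocks spread over five servers
stored[2] = None                 # one remote server fails or is withheld
assert decode(stored).rstrip(b"\x00") == ciphertext
print("recovered", len(ciphertext), "bytes despite a lost block")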


When a user wishes to recover the original data, the central server requests the data blocks from randomly chosen remote servers, reconstructs it and sends it in encrypted form back to the original user, who can recover the encryption key K and decrypt the message because they have the proportion of R that was originally retained locally as well as that which was inserted into the message. A hacker, however, could only obtain the part that was uploaded. The researchers write that they conducted a “minimal test system to verify the functionality and performance of our proposal” and that “the next step in developing this technology involves researching and validating multi-user storage technology. This means we will be focusing on how our system can effectively and securely handle data storage for multiple users.”


Further work needed


Barry Sanders, who directs the Institute for Quantum Science and Technology at the University of Calgary in Canada, describes a paper on the work in AIP Advances as “a good paper discussing some issues concerning how to make cloud storage secure in a quantum sense”. However, he believes more specifics are necessary. In particular, he would like to see a real demonstration of a distributed cloud storage system that meets the requirements one would expect in cybersecurity.


“They don’t do that, even in the ideal sense,” says Sanders, who holds an appointment at USTC but was not involved in this work. “What is the system you’re going to create? How does that relate to other systems? What are the threat models and how do we show that adversaries are neutralized by this technique? None of these are evident in this paper.”


Renato Renner, who leads a quantum information theory research group at ETH Zurich, Switzerland, is similarly critical. “The positive part [of the paper] is that it at least tries to combine quantum-inspired protocols and integrate them into classical cryptographic tasks, which is something one doesn’t see very often,” he says. “The issue I have is that this paper uses many techniques which are a priori completely unrelated – secret sharing is not really related to QKD, and quantum random number generation is different from QKD – they mix them all together, but I don’t think they make a scientific contribution to any of the individual ingredients: they just compose them together and say that maybe this combination is a good way to proceed.”


Like Sanders, Renner is also unconvinced by the team’s experimental test. “Reading it, it’s just a description of putting things together, and I really don’t see an added value in the way they do it,” he says.


The post Researchers grapple with bringing quantum security to the cloud appeared first on Physics World.


]]>
https://hadamard.com/c/researchers-grapple-with-bringing-quantum-security-to-the-cloud/feed/ 0 66
Toby Cubitt: why algorithms will speed up applications of quantum computers https://hadamard.com/c/toby-cubitt-why-algorithms-will-speed-up-applications-of-quantum-computers/ https://hadamard.com/c/toby-cubitt-why-algorithms-will-speed-up-applications-of-quantum-computers/#respond Tue, 12 Dec 2023 11:00:51 +0000 https://physicsworld.com/?p=111441 Continue reading Toby Cubitt: why algorithms will speed up applications of quantum computers]]> Quantum computers show great promise because they could, at least in principle, solve certain problems that cannot be cracked even by the most powerful conventional supercomputers. But building quantum bits, or qubits – and linking them to create practical quantum computers – is a huge challenge. In particular, quantum computers are incredibly noisy, which quickly introduces errors into quantum calculations.


That’s why many researchers are developing clever quantum algorithms that can do useful calculations even on today’s small, noisy quantum computers. One company contributing to that effort is Phasecraft, which was spun off from University College London and the University of Bristol in 2019. The physicist Toby Cubitt, co-founder and chief technology officer at Phasecraft, talks to Hamish Johnston about how real-world applications could be just around the corner.


Why did you originally set up Phasecraft?


We founded Phasecraft because quantum computing was reaching the point where quantum-computing hardware was no longer just a toy system, but pushing the boundaries of what could be done on conventional computers. We wanted to try to develop the algorithms needed to make use of that early-stage hardware and make quantum applications a reality. That’s a huge challenge scientifically, but a fascinating one to be involved in.


How big is the company at the moment?


We currently have about 20 full-time staff, roughly a third of whom have a background in quantum computing or quantum information theory, a third in materials science, condensed matter and chemistry, and a third on the computing side. They all have a knowledge of quantum computing, but are also very, very good at – and love – programming this stuff, and implementing it, and getting it working on the hardware.


We sponsor PhD students who are at places like University College London and the University of Bristol but who work directly here in the company’s offices. We also have lots of interns – both undergraduates and PhD students. We’re very focused on research and development at the moment. But as useful applications come online, I expect things to become much more commercial in nature.


Would you say quantum software has been ignored in favour of all the hype and excitement of developing new qubits and processor technologies?


Hardware is extremely important and deserves the attention it’s been given, involving as it does some fascinating physics, materials science and engineering. But for us on the software side, it’s all about coming up with clever mathematical ideas to make algorithms more efficient and work on today’s early-stage, small-scale quantum devices. In fact, we’re more likely to make progress through better algorithms than by waiting for improvements in hardware.


Even if quantum hardware grew exponentially, it could be a decade before you could do anything useful with it. Working on algorithms also doesn’t require expensive cryostats, dilution refrigerators, liquid helium or chips – just a bunch of really smart people thinking deeply, which is what we have at Phasecraft. A few years ago, for example, we developed algorithms for simulating the time dynamics of quantum systems that were about six orders of magnitude better than those from Google and Microsoft.


Quantum processors are noisy, which means they quickly lose coherence and make calculations impossible. How do you develop practical algorithms to run on imperfect devices?


Noise and errors are the bane of all quantum applications on real hardware. There have been some incredible improvements to hardware, but we can’t assume quantum computers are perfect, as we can with classical devices. So with everything we do in Phasecraft, we have to think in terms of imperfect, noisy quantum computers that have errors. Run any computation and the errors build up so fast that you’re just getting noise – random data – out, and you’ve lost all of the quantum information.


To get round this problem, it’s critical to make algorithms as efficient as possible and make them less sensitive or susceptible to noise. It’s true that in the 1990s Peter Shor developed the concept of quantum error correction and the fault-tolerant threshold theorem, which shows, theoretically, that even on noisy quantum computers, you can run arbitrarily long quantum computation calculations. But that requires such huge numbers of qubits that we can’t count on this as a solution.


Three men stood and sat on stone steps in front of a large old building


Our focus is therefore more an engineering-type problem, where we try to understand what noise looks like in detail. The better we can understand noise, the more we can design around it so it doesn’t affect the outcome. But there’s a big payoff because if you can make an algorithm less complex, you can get something useful out of these noisy quantum computers. It’s a question of designing the algorithms so we can squeeze more out of them.


I often say that today’s quantum computers are where classical computers were in the 1950s. Back then, people like Alan Turing were coming up with really clever ideas of how to squeeze a bit more out of clunky primitive hardware and actually do incredible things with it. That’s the stage we’re at with quantum computing. In fact, certain algorithms are sometimes more suited for one type of hardware than another.


In terms of hardware, what type of qubits are you using at the moment?


At Phasecraft we’re interested in all types of hardware. Predominantly, though, we’re using superconducting qubit circuits, because that’s the current leading hardware platform. But we’re running on ion-trap and cold-atom hardware too, and we’re also thinking about photonic hardware. But we’re not tied to one particular platform.


Phasecraft’s focus is on algorithms that calculate material properties. Why are those applications so suitable for today’s early quantum computers?


In industry, many companies spend a lot of time and money using classical, high-performance computers to work out the properties of materials. The trouble is, it’s very computationally intensive so they end up trying to simplify the problem. But the danger then is you can get things completely wrong. For example, you may end up predicting a material is an insulator when in fact it’s a conductor. It can be that level of wrong sometimes.


At Phasecraft, we’re focusing on modelling and simulating materials because those applications are within closest reach of current hardware. Other applications, such as optimization, are more demanding in terms of the number of qubits and gates you need. As hardware improves, quantum chemistry simulations will come within our reach. Molecules are harder to simulate than periodic, crystalline materials because the complexity of an algorithm for molecular systems scales as the number of electron orbitals to the power of four.
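
The quartic scaling Cubitt mentions is usually traced to the two-electron terms of the second-quantized electronic Hamiltonian. In standard notation (a sketch of the textbook expression, not Phasecraft's own formulation),

H = \sum_{p,q} h_{pq}\, a_{p}^{\dagger} a_{q} + \tfrac{1}{2} \sum_{p,q,r,s} h_{pqrs}\, a_{p}^{\dagger} a_{q}^{\dagger} a_{r} a_{s},

the second sum runs over four independent orbital indices, so for N orbitals there are of order N^4 interaction terms to store and simulate, which is why molecular problems grow in cost so much faster than crystalline materials with exploitable periodicity.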


Can you give us a taste of some specific materials you’ve looked at?


At the moment, the hardware is not yet large enough to be able to do simulations of real materials beyond what can be done classically. So we’re still at the stage where we have the algorithms, but we don’t yet quite have the hardware to run on, although it’s getting close. Having said that, the types of materials that are good targets for early-stage applications of quantum computing are clean-energy-related – battery materials, things like metal oxides.


They also happen to be ones where classical algorithms don’t work very well, because they involve strongly correlated electrons. The same goes for photovoltaics. In fact, we have a collaboration with Oxford PV, which is working with perovskite photovoltaics, where we’re again looking at strongly correlated electron systems. This involves dynamically simulating things like the rate at which particle-hole pairs recombine to emit light.


We’ve also examined strontium vanadate, which happens to have a nice band structure that means that it can fit on a smaller quantum computer than certain other materials. It’s not the smallest, but it’s a metal-oxide system that’s of interest and needs fewer qubits and fewer gates than other metal oxides.


When do you think Phasecraft will reach the point of “quantum advantage” where your algorithms can run on a quantum processor and can calculate things a supercomputer can’t?


That’s the million-dollar question. In fact, it’s probably the billion-dollar question. The quantum industry needs to get to that point where it’s not just demonstrating toy problems but solving real-world problems on quantum computers.


I hope I don’t sound like the guy who supposedly once said there’d only ever be a need for three computers in the world, but I genuinely think we might get there in the next two to three years. Those early questions may be of scientific interest rather than industrial interest – industry might be a little beyond that point. It’s not going to be a case of switching off your high-performance computing (HPC) clusters overnight and moving straight over to a quantum computer. It’s much more likely to be a gradual process whereby more and more useful things will come online. It’s how science works: you make progress, you hit an obstacle and then make more progress. It tends to ratchet up.



When the wider media report on quantum computers, they tend to assume massive breakthroughs emerge out of the blue from nowhere. But they don’t. Progress depends on lots of hard work by large teams of scientists working diligently for many years. That’s what’s going on in quantum computing, and the first applications might not hit the headlines. But scientists will realize when we’ve passed that threshold where you can do things that are impossible with conventional computers. We’re not far off.


Phasecraft recently received £13m in private funding. What do you plan to do with that cash?


For a quantum algorithm company like ours, the vast majority of funding goes on paying people’s salaries. Our staff are the key – our most valuable asset is our team. For a hardware company it’s very different, because hardware is expensive. But we need people to think and code so that money will let us steadily expand our team.


We’ve always got more ideas than we have the resources to pursue and, as we get closer to implementing large computations on quantum computers, we’ll be scaling up the team. It’s still a few years before we will have commercially relevant applications, but when that happens, we’ll go through an inflection point and the whole industry will change. We are always keen to talk to smart people who are excited about using quantum mechanics for real-world applications.


So how will the firm evolve?


All it takes is one amazing, outstanding idea that could completely change the whole quantum industry. We’re keen on making sure we give our research team the space to do that kind of blue-sky thinking that could change the face of where the company goes. Sure, not all ideas will work – 20 might fail but the 21st will turn out to be a significant new direction that no-one else thought of. That’s happened a couple of times at Phasecraft already. Someone gets inspired, and then a new direction opens up.


We’re at a hugely exciting time in quantum computing. I’m still a professor at UCL, and I still have an academic group there, but I find both sides – applied and theoretical – equally intellectually interesting. I’ve theorized about some topics for 20 years but haven’t had any tools to put them into practice. Now, though, I can take that theory and make it real. Instead of just writing a paper, I can run my idea on hardware.


Sure, it might not work at all. It could turn out that the real universe says: “No. That’s not a good idea.” But it could still be an incredibly useful and fascinating problem to tackle. And so the applied side of the research – applying this physics to the technology – I find just as fascinating and interesting as the blue-sky academic thinking.


The post Toby Cubitt: why algorithms will speed up applications of quantum computers appeared first on Physics World.


]]>
https://hadamard.com/c/toby-cubitt-why-algorithms-will-speed-up-applications-of-quantum-computers/feed/ 0 67
Top 10 Breakthroughs of 2023: we explore this year’s best physics research https://hadamard.com/c/top-10-breakthroughs-of-2023-we-explore-this-years-best-physics-research/ https://hadamard.com/c/top-10-breakthroughs-of-2023-we-explore-this-years-best-physics-research/#respond Thu, 07 Dec 2023 15:08:52 +0000 https://physicsworld.com/?p=111715 Continue reading Top 10 Breakthroughs of 2023: we explore this year’s best physics research]]> n

This episode of the Physics World Weekly podcast features a lively discussion about our Top 10 Breakthroughs of 2023. Physics World editors discuss the merits of research on a broad range of topics including particle physics, quantum technology, medical physics and astronomy.


The top 10 serves as the shortlist for the Physics World Breakthrough of the Year award, the winner of which will be announced on 14 December.


Links to all the nominees, more about their research and the criteria for the award can be found here.


Physics World‘s coverage of the Breakthrough of the Year is supported by Reports on Progress in Physics, which offers unparalleled visibility for your ground-breaking research.


The post Top 10 Breakthroughs of 2023: we explore this year’s best physics research appeared first on Physics World.


]]>
https://hadamard.com/c/top-10-breakthroughs-of-2023-we-explore-this-years-best-physics-research/feed/ 0 74
Physics World reveals its top 10 Breakthroughs of the Year for 2023 https://hadamard.com/c/physics-world-reveals-its-top-10-breakthroughs-of-the-year-for-2023/ https://hadamard.com/c/physics-world-reveals-its-top-10-breakthroughs-of-the-year-for-2023/#respond Thu, 07 Dec 2023 15:00:50 +0000 https://physicsworld.com/?p=111709 Continue reading Physics World reveals its top 10 Breakthroughs of the Year for 2023]]> Physics World is delighted to announce its top 10 Breakthroughs of the Year for 2023, which range from research in astronomy and medical physics to quantum science, atomic physics and more. The overall Physics World Breakthrough of the Year will be revealed on Thursday 14 December.


The 10 Breakthroughs were selected by a panel of Physics World editors, who sifted through hundreds of research updates published on the website this year across all fields of physics. In addition to having been reported in Physics World in 2023, selections must meet the following criteria:


  • Significant advance in knowledge or understanding
  • Importance of work for scientific progress and/or development of real-world applications
  • Of general interest to Physics World readers

The Top 10 Breakthroughs for 2023 are listed below in chronological order of when they were reported in Physics World. Come back next week to find out which one has won the overall Physics World Breakthrough of the Year award.


Growing electrodes inside living tissue


Injectable gel for creating electrodes


To Xenofon Strakosas, Hanne Biesmans, Magnus Berggren and colleagues at Linköping University, Lund University and the University of Gothenburg for developing a way to create electronic circuits directly inside living tissue. Interfacing neural tissue with electronics provides a way to study the complex electrical signalling of the nervous system or modulate neural circuitry to treat disease. However, the mismatch between rigid electronics and soft tissues risks damaging delicate living systems. Instead, the team used an injectable gel to create soft electrodes directly within the body. After injection into living tissue, enzymes in the gel break down endogenous metabolites in the body, which trigger enzymatic polymerization of organic monomers in the gel, converting them into stable, soft conducting electrodes. The researchers validated the process by injecting gels into zebrafish and medicinal leeches, where the gel polymerized and grew electrodes within the tissue. 


Neutrinos probe the proton’s structure


To Tejin Cai at the University of Rochester in the US and Canada’s York University, and colleagues working on Fermilab’s MINERvA experiment for showing how information about the internal structure of the proton can be gleaned from neutrinos scattering from a plastic target. Neutrinos are subatomic particles that are famous for rarely interacting with matter. So, there were doubts when Cai, a postdoctoral researcher, suggested that the occasional scattering of neutrinos from protons in plastic could be observed. The big challenge for the team was observing the signal from neutrinos scattered from lone protons (hydrogen nuclei) within the much larger background of neutrinos scattered off protons bound-up in carbon nuclei. To solve this problem, they simulated the carbon-scattered signal and carefully subtracted it from the experimental data. As well as providing insights into the structure of the proton, the technique could also shed further light on how neutrinos interact with matter.
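
Schematically, the subtraction described above can be written as follows (the notation is ours, not MINERvA’s, and the real analysis involves full systematic uncertainties and fits rather than a simple bin-by-bin difference):

```latex
% Hydrogen (free-proton) signal estimated by subtracting the simulated
% carbon background from the measured event rate in each kinematic bin:
\[
N_{\mathrm{H}}(\theta) \;\approx\; N_{\mathrm{data}}(\theta) \;-\; N_{\mathrm{C}}^{\mathrm{sim}}(\theta),
\qquad
\sigma_{N_{\mathrm{H}}}^{2} \;\approx\; \sigma_{\mathrm{data}}^{2} + \sigma_{\mathrm{sim}}^{2}.
\]
```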


Simulating an expanding universe in a BEC 


To Celia Viermann and Markus Oberthaler of the University of Heidelberg, Germany, together with Stefan Floerchinger of the University of Jena, Germany, and colleagues at the Universidad Complutense de Madrid, Spain, Ruhr-Universität Bochum, Germany and the Université libre de Bruxelles, Belgium, for using a Bose–Einstein condensate (BEC) to simulate an expanding universe and the quantum fields within it. In this simulated system, the condensate represented the universe, while phonons moving through it played the role of the quantum fields. By changing the scattering length of the atoms in the BEC, the team made the “universe” expand at different rates and studied how the phonons seeded density fluctuations within it. Theories of cosmology predict that similar effects were responsible for seeding large-scale structure in the early universe, so the simulated universe may produce valuable insights into how the real one came to be the way it is today.


A double slit in time   


To Romain Tirole and Riccardo Sapienza at Imperial College London and colleagues for the demonstration of Young’s double-slit interference in time. The 19th-century observation of the interference of light waves by Thomas Young is one of the most iconic experiments in the history of physics and provided fundamental support to the wave theory of light. While that experiment and others like it involve diffraction of light through a pair of narrow slits in space, researchers in the UK and elsewhere showed it is possible to achieve the equivalent effect using double slits in time. The temporal analogue involves fixed momentum but changing frequency. A material in which two slits rapidly appear and then disappear, one after the other, should cause incoming waves to maintain their path in space but spread out in frequency. The researchers achieved this by turning the reflectivity of a semiconductor mirror on and off twice in quick succession and recording interference fringes along the frequency spectrum of light bounced off the mirror. They saw that the interference happens between waves at different frequencies – rather than different spatial positions. The work could have several applications such as optical switches for signal processing and communication or in optical computing. 
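
A rough Fourier-space way to see why the fringes appear in frequency (a back-of-the-envelope estimate on our part, not a result quoted from the paper): if the two “time slits” have essentially identical responses and are separated by a delay Δt, the reflected spectrum picks up a relative phase ωΔt, so

```latex
\[
I(\omega) \;\propto\; \bigl|\, \tilde{E}(\omega) + \tilde{E}(\omega)\, e^{\,i\omega\Delta t} \,\bigr|^{2}
\;=\; 2\,|\tilde{E}(\omega)|^{2}\bigl[1 + \cos(\omega\Delta t)\bigr],
\qquad
\Delta f_{\text{fringe}} \;\approx\; \frac{1}{\Delta t}.
\]
```

That is, the closer together the two temporal slits, the wider the spacing of the frequency fringes, in direct analogy with the spatial case.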


Digital bridge enables natural walking after spinal cord injury


Walking with a digital bridge between the brain and spinal cord


To Grégoire Courtine at Ecole Polytechnique Fédérale de Lausanne (EPFL), Jocelyne Bloch at Lausanne University Hospital and EPFL, Guillaume Charvet at CEA-Leti’s Clinatec, and colleagues for developing a “digital bridge” between the brain and spinal cord that enabled an individual with paralysis to stand and walk naturally. Spinal cord injury can disconnect communication between the brain and the region of the spinal cord that produces walking, which can lead to permanent paralysis. To restore this communication, the team developed a brain–spine interface, comprising two implantable systems: one to record cortical activity and decode the user’s intention to move the lower limbs; and the other to electrically stimulate the region of the spinal cord that controls leg movement. The team tested the system in a 38-year-old man with a spinal cord injury from a bike accident 10 years earlier. Following implant surgery, the bridge enabled the participant to regain intuitive control over his leg movements, enabling him to stand, walk, climb stairs and traverse complex terrains. 


Building blocks for a large-scale quantum network 


To Ben Lanyon and colleagues at the University of Innsbruck, Austria, and the University of Paris-Saclay, France, for constructing a quantum repeater and using it to transfer quantum information over a distance of 50 km via standard telecommunications fibres, thereby demonstrating all the key functionalities of a long-distance quantum network in a single system. The team created its quantum repeater from a pair of trapped calcium-40 ions that emit photons after being illuminated with a laser pulse. These photons, each of which is entangled with its “parent” ion, are then converted to telecoms wavelengths and sent down separate 25-km-long optical fibres. Finally, the repeater swaps the entanglement on the two ions, leaving two entangled photons 50 km apart – roughly the distance required to create large-scale networks with multiple nodes. 
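
The entanglement swap at the heart of the repeater can be sketched in standard textbook notation (ours, not necessarily the paper’s): each photon starts out entangled with its parent ion, and a joint Bell-state measurement on the two ions transfers that entanglement to the two remote photons.

```latex
\[
|\Phi^{+}\rangle_{p_1 i_1} \otimes |\Phi^{+}\rangle_{i_2 p_2}
\;\xrightarrow{\ \text{Bell measurement on } i_1,\, i_2\ }\;
|\Phi^{\pm}\rangle_{p_1 p_2} \ \text{or}\ |\Psi^{\pm}\rangle_{p_1 p_2}
\quad \text{(heralded by the measurement outcome).}
\]
```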


First X-ray image of a single atom


To Saw Wai Hla and Volker Rose at Argonne National Laboratory in the US and colleagues for imaging a single atom with synchrotron X-rays. Until recently, the smallest sample size that could be analysed using synchrotron X-ray scanning tunnelling microscopy was an attogram, which is around 10,000 atoms. This is because the X-ray signal produced by a single atom is extremely weak and conventional detectors are not sensitive enough to detect it. To get around this, the team added a sharp metallic tip to a conventional X-ray detector, which is placed just 1 nm above the sample to be studied. As the sharp tip is moved across the surface of a sample, electrons tunnel through the space between the tip and the sample, creating a current that essentially detects “fingerprints” unique to each element. This allowed the team to combine the ultrahigh spatial resolution of scanning tunnelling microscopy with the chemical sensitivity provided by intense X-ray illumination. The technique could lead to applications in material design as well as in environmental science through the ability to trace toxic materials down to extremely low levels.


“Smoking gun” evidence of early galaxies transforming the universe


To the EIGER Collaboration for using the James Webb Space Telescope (JWST) to find compelling evidence that early galaxies were responsible for the reionization of the early universe. Reionization occurred about 1 billion years after the Big Bang and involved the ionization of hydrogen gas. This allowed light that would have been absorbed by hydrogen to travel to the telescopes of today. Reionization appears to have begun as local bubbles that grew and coalesced. These bubbles would have been created by sources of radiation, and one possibility is that it came from stars in galaxies. The EIGER researchers used the JWST’s Near Infrared Camera to look at light from ancient quasars that had passed through the ionized bubbles. They found a correlation between the locations of galaxies and the bubbles, suggesting that light from these early galaxies was indeed responsible for reionization.


Supersonic cracks in materials


To Meng Wang, Songlin Shi and Jay Fineberg of the Hebrew University of Jerusalem, Israel, for discovering that cracks in certain materials can spread faster than the speed of sound. The result contradicts both previous experimental results and predictions based on classical theory, which state that supersonic crack propagation should not be possible because the speed of sound in a material reflects how quickly mechanical energy can move through it. The team’s observations may indicate the presence of so-called “supershear” dynamics governed by different principles than those that guide classical cracks, as predicted by Michael Marder of the University of Texas at Austin, US nearly 20 years earlier.


Antimatter does not fall up


Barrel scintillator


To the ALPHA Collaboration for showing that antimatter responds to gravity in much the same way as matter. The physicists used the ALPHA-g experiment at CERN to make the first direct observation of free-falling antimatter atoms – antihydrogen that comprises an antiproton bound to an antielectron. This was done in a tall cylindrical vacuum chamber in which antihydrogen was first held in a magnetic trap. The antihydrogen was released from the trap and allowed to annihilate at the walls of the chamber. The team found that more annihilations occurred below the release point than above it. After considering the thermal motion of the antihydrogen, the team concluded that antimatter falls down. Tantalizingly, the antihydrogen’s acceleration due to gravity was about 75% of that experienced by normal matter. Although this measurement has a low statistical significance, it leaves the door open to new physics beyond the Standard Model.


Honourable mention


Fusion energy breakthrough 


An honourable mention in our top 10 for this year goes to physicists working at the $3.5bn National Ignition Facility (NIF) in the US for work that was performed at the lab late last year after we picked our 2022 winners (and so misses out on our 2023 breakthrough choice too). On 13 December 2022 the lab announced the generation of more energy from a controlled nuclear fusion reaction than was needed to power the reaction. The laser shot, performed on 5 December 2022, released 3.15 million joules (MJ) of energy from a tiny pellet containing two hydrogen isotopes – compared to the 2.05 MJ that those lasers delivered to the target. This demonstration of net energy gain marks a major milestone in laser fusion.
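
For readers keeping score, the quoted numbers correspond to a target gain of roughly 1.5; note that this compares the fusion output with the laser energy delivered to the target, not with the much larger wall-plug energy needed to run the laser system.

```latex
\[
Q_{\text{target}} \;=\; \frac{E_{\text{fusion}}}{E_{\text{laser}}} \;=\; \frac{3.15\ \text{MJ}}{2.05\ \text{MJ}} \;\approx\; 1.5 .
\]
```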


Physics World‘s coverage of the Breakthrough of the Year is supported by Reports on Progress in Physics, which offers unparalleled visibility for your ground-breaking research.


The post Physics World reveals its top 10 Breakthroughs of the Year for 2023 appeared first on Physics World.


]]>
https://hadamard.com/c/physics-world-reveals-its-top-10-breakthroughs-of-the-year-for-2023/feed/ 0 75
Laser light goes for a quantum walk in a microchip https://hadamard.com/c/laser-light-goes-for-a-quantum-walk-in-a-microchip/ https://hadamard.com/c/laser-light-goes-for-a-quantum-walk-in-a-microchip/#respond Wed, 06 Dec 2023 13:00:37 +0000 https://physicsworld.com/?p=111669 Continue reading Laser light goes for a quantum walk in a microchip]]> Researchers at ETH Zürich in Switzerland have transformed a microchip laser that emits a single frequency (or colour) of light into one that emits light over a broad range of frequencies. The new optical comb device, which works thanks to a process known as a quantum walk, could be used to make miniaturized optical sensors for environmental and medical monitoring and to increase data transmission rates in telecommunications.


Led by physicist Jérôme Faist, the ETH researchers began with a quantum cascade laser integrated into a microchip. This device consists of a micro-ring structure made up of semiconductor layers containing gallium, indium, aluminium and arsenic. The ring confines and guides light, and when a direct electrical current is applied, the electrons in it are stimulated to quickly jump across the different layers, emitting a cascade of photons. As the photons circulate in the ring, they multiply, producing coherent laser light with a single frequency.


Faist and colleagues found that if they excite this system with an additional alternating current oscillating at a certain resonance frequency, the light emitted goes from being a single colour to multiple colours in the space of just a few nanoseconds. Notably, before it settles into its final form, the spectrum of the emitted light evolves in a way that resembles a so-called quantum walk.


A laser’s quantum walk


First proposed by the physicist and Nobel laureate Richard Feynman, the quantum walk is very different from the classical random walk commonly used to model the behaviour of physical systems ranging from fluctuating stock markets to the Brownian motion of pollen grains on the surface of a liquid. The classical random walk works like a lost hiker who chooses their next steps according to the toss of a coin. If the coin lands on heads, for example, the hiker might take a step to the left, whereas tails might call for a step to the right. After many coin tosses, the hiker’s position will be random, but likely close to their starting point.


In a quantum walk, in contrast, a quantum particle effectively moves in both directions at the same time after every toss, adopting a coherent superposition of right and left. This means there are always several possible paths the particle can take to arrive at its final position.
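
To make the contrast concrete, here is a minimal numerical sketch (written for this article, not taken from the ETH Zürich group’s code): a classical random walker’s spread grows like the square root of the number of steps, while a discrete-time “Hadamard” quantum walk spreads linearly with the number of steps.

```python
# Minimal illustration: classical random walk vs discrete-time quantum walk.
# The classical walk spreads diffusively (sigma ~ sqrt(steps)), the quantum
# walk with a Hadamard "coin" spreads ballistically (sigma ~ steps).
import numpy as np

def classical_walk(steps):
    """Return the probability distribution over positions -steps..steps."""
    prob = np.zeros(2 * steps + 1)
    prob[steps] = 1.0  # start at the origin
    for _ in range(steps):
        prob = 0.5 * (np.roll(prob, 1) + np.roll(prob, -1))
    return prob

def quantum_walk(steps):
    """Hadamard walk: amplitudes indexed by (position, coin state)."""
    amp = np.zeros((2 * steps + 1, 2), dtype=complex)
    amp[steps] = [1 / np.sqrt(2), 1j / np.sqrt(2)]  # symmetric initial coin
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        amp = amp @ H.T                    # toss the quantum coin
        shifted = np.zeros_like(amp)
        shifted[1:, 0] = amp[:-1, 0]       # coin |0> steps right
        shifted[:-1, 1] = amp[1:, 1]       # coin |1> steps left
        amp = shifted
    return (np.abs(amp) ** 2).sum(axis=1)  # probability over positions

steps = 100
x = np.arange(-steps, steps + 1)
for name, p in [("classical", classical_walk(steps)), ("quantum", quantum_walk(steps))]:
    sigma = np.sqrt(np.sum(p * x**2) - np.sum(p * x) ** 2)
    print(f"{name:9s} walk after {steps} steps: spread sigma = {sigma:.1f}")
```

Running it for 100 steps gives a spread of about 10 positions for the classical walk and roughly 50 for the quantum walk, which is the ballistic behaviour described above.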


An optical comb-like spectrum


In the new device, this quantum walk has a remarkable outcome. “The different colours (or frequencies) add energy to the light emitted and create an optical comb-like spectrum,” Faist explains. “The optical frequencies are equidistant from each other, and their number is selected by the frequency and amplitude of the electrical oscillating signal sent to the laser.”
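
One simple way to read “equidistant” (our schematic notation, not the paper’s): the modulation at frequency f_mod dresses the original single lasing line f_0 with sidebands, so the comb teeth sit at

```latex
\[
f_n \;=\; f_0 \;+\; n\, f_{\mathrm{mod}}, \qquad n = 0,\ \pm 1,\ \pm 2,\ \dots
\]
```

with the tooth spacing set by the drive frequency and the number of teeth growing with the drive amplitude, consistent with the description quoted above.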


As for applications, the researchers say miniaturized optical sensors for environmental and medical monitoring are a possibility. In the longer term, Faist adds that such devices could increase the data transmission rate for optical communications, since each colour of light the laser emits – up to 100 colours in total – could serve as an independent communication channel.


The researchers report their findings in Science.


The post Laser light goes for a quantum walk in a microchip appeared first on Physics World.


]]>
https://hadamard.com/c/laser-light-goes-for-a-quantum-walk-in-a-microchip/feed/ 0 78
Charge qubits get a thousand-fold boost https://hadamard.com/c/charge-qubits-get-a-thousand-fold-boost/ https://hadamard.com/c/charge-qubits-get-a-thousand-fold-boost/#respond Mon, 04 Dec 2023 11:11:51 +0000 https://physicsworld.com/?p=111642 Continue reading Charge qubits get a thousand-fold boost]]> Researchers in the US have improved the coherence time of charge quantum bits (qubits) by a factor of 1000 thanks to advances in the materials used to construct them. Led by Dafei Jin of the Argonne Center for Nanoscale Materials and David Schuster of Stanford University and the University of Chicago, the multi-institutional team also showed it was possible to read out the state of these qubits with a fidelity of 98.1% – a value Jin says will increase further with the aid of more sophisticated readout technologies.


Coherence time is vitally important within quantum computing, as it denotes how long a qubit can remain in a superposition of multiple states before environmental noise causes it to decohere, or lose its quantum nature. During this period, a quantum computer can perform complex computations that classical computers cannot.


Many quantum systems can act as qubits. Spin qubits, for example, encode quantum information in the spin of an electron or nucleus, which can be up, down or a superposition of the two. Charge qubits, for their part, represent quantum information through the presence or absence of an excess charge, such as an electron, within the qubit system. They are relatively new – members of the team created the first in 2022 – and Jin says they have several advantages over spin qubits.


“Charge qubits typically permit much faster operation speed because charges couple strongly with electric fields,” he explains. “This is advantageous over spin qubits because spins couple weakly with magnetic fields. Charge qubit devices are generally much easier to fabricate and operate, because most existing fabrication and operation infrastructures are based on charges and electric fields, rather than spins and magnetic fields. They can often be made more compact.”


Ultraclean is ultraquiet


Jin explains that the researchers created their charge qubits by trapping an electron within a quantum dot, which is a nanoscale collection of atoms that behaves like a single quantum particle. The quantum dot rests on a surface made from solid neon and is placed in a vacuum.


According to Jin, this ultraclean environment is key to the experiment’s success. Neon, as a noble gas, will not form chemical bonds with other elements. In fact, as the team point out in a Nature Physics paper on the research, neon in a low-temperature and near-vacuum environment will condense into an ultrapure semi-quantum solid devoid of anything that could introduce noise into the qubit. This lack of noise enabled the team to boost the coherence time of the charge qubit from the 100 nanoseconds typical of previous efforts to 100 microseconds.
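
The headline thousand-fold figure follows directly from those two numbers (the labels below are ours):

```latex
\[
\frac{\tau_{\text{neon}}}{\tau_{\text{previous}}} \;=\; \frac{100\ \mu\text{s}}{100\ \text{ns}}
\;=\; \frac{10^{-4}\ \text{s}}{10^{-7}\ \text{s}} \;=\; 10^{3}.
\]
```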


What is more, the researchers read out the state of these qubits with 98.1% fidelity without using a quantum-limited amplifier, which Jin describes as “a special device placed at very low temperature (in our case 10 millikelvin) that can amplify weak electromagnetic signals but bring in nearly zero thermal noise”. Because such devices enhance readout ability, obtaining 98.1% fidelity without them is, Jin says, especially impressive. “In our future experiments, once we use them, our readout fidelity can only go much higher,” he adds.


The next milestone


While a thousand-fold increase in coherence time is already a major improvement over previous charge qubit systems, the researchers expect even more in the future. According to Jin, the team’s theoretical calculations suggest that the charge qubit system could reach a coherence time of 1–10 milliseconds, representing another factor of 10–100 improvement over current values. To realize this, though, scientists will need to gain better control over every aspect of the experiment, from device design and fabrication to qubit control.


Beyond that, Jin and colleagues continue to look for ways to improve the system even further.


“The biggest milestone next is to show two charge qubits can be entangled together,” Jin says. “We have been working on that and have had a lot of progress. Once we accomplish that, our qubit platform is then ready for universal quantum computing, even though some detailed performance can keep being improved.”


The post Charge qubits get a thousand-fold boost appeared first on Physics World.


]]>
https://hadamard.com/c/charge-qubits-get-a-thousand-fold-boost/feed/ 0 86
Why Alice & Bob are making cat qubits, IOP calls for action on net-zero target https://hadamard.com/c/why-alice-bob-are-making-cat-qubits-iop-calls-for-action-on-net-zero-target/ https://hadamard.com/c/why-alice-bob-are-making-cat-qubits-iop-calls-for-action-on-net-zero-target/#respond Thu, 30 Nov 2023 16:45:41 +0000 https://physicsworld.com/?p=111616 Continue reading Why Alice & Bob are making cat qubits, IOP calls for action on net-zero target]]> n

This episode of the Physics World Weekly podcast looks at two very different and very difficult challenges — how to build a quantum computer that can overcome the debilitating noise that plagues current processors; and how to ensure that the UK meets its target for net-zero greenhouse gas emissions by 2050.


Our first guest is the nuclear physicist and sustainable energy expert, Martin Freer, who coordinated the writing of a report from the Institute of Physics (IOP) called Physics Powering the Green Economy. Freer, who is at the University of Birmingham, explains why more investment and support will be needed to ensure that the UK meets its target to achieve net-zero greenhouse gas emissions by 2050.


Meanwhile in Paris, the quantum-computer maker Alice & Bob is developing “cat qubits” that promise to reduce the amount of hardware required to do quantum error correction. The company’s co-founder and CEO Théau Peronnin explains how the technology works and how it could be used to build quantum computers that could solve practical problems. He also explains why the company chose its quirky name.


The post Why Alice & Bob are making cat qubits, IOP calls for action on net-zero target appeared first on Physics World.


]]>
https://hadamard.com/c/why-alice-bob-are-making-cat-qubits-iop-calls-for-action-on-net-zero-target/feed/ 0 90
Partnerships push for quantum advantage https://hadamard.com/c/partnerships-push-for-quantum-advantage/ https://hadamard.com/c/partnerships-push-for-quantum-advantage/#respond Thu, 30 Nov 2023 11:20:44 +0000 https://physicsworld.com/?p=111578 Continue reading Partnerships push for quantum advantage]]> One of the many factors that have contributed to the success of the UK’s National Quantum Technologies Programme (NQTP) has been its emphasis on the power of collaboration. At its inception in 2014 the NQTP established four research hubs that enabled academic groups from across the UK to share knowledge and resources, accelerating the development of novel quantum technologies and catalysing the development of a thriving start-up sector.


That culture of collaboration has been fully embraced at the National Quantum Computing Centre (NQCC), launched in 2020 as a flagship initiative of the NQTP. With a diverse programme of activities spanning technology development, infrastructure provision and end-user engagement, the national lab has been forging strategic links with both industrial and academic partners to deliver its headline objective of demonstrating quantum advantage – the point at which a quantum computer can solve a specific problem faster than its classical counterpart – by 2025.


In one notable development, announced in November 2023, an agreement with IBM Quantum will provide the NQCC and its collaborators with cloud-based access to the computing giant’s entire fleet of quantum computers. Among them are multiple machines with 127 qubits, delivering what the company terms “utility-scale” quantum computing.


“Quantum developers have been using small numbers of qubits for a long time, but they also need to work with larger scale machines to demonstrate how quantum computers have become useful tools for solving classes of problems beyond brute-force classical simulation of quantum mechanics,” explained IBM’s Adam Hammond, speaking at the annual National Quantum Technologies Showcase in London at the beginning of November. “Our focus at IBM is to bring forward the day when quantum computers can do work that is difficult to achieve with classical computers, and eventually to deliver a clear advantage in use cases that benefit industry and advance science.”


While access to the quantum hardware will be provided as a commercial service, the agreement will also enable the NQCC to join IBM’s Quantum Network, which brings together around 250 organizations from around the world. “We encourage collaborations between members of the network and with us, looking at things like algorithm development and the application of quantum computing to practical use cases where we think we can reach quantum advantage first,” explained Hammond. “We are just at the beginning of our partnership with the NQCC, and we are keen to explore how we can grow that partnership by working together and sharing our knowledge and experience.”


Adam Hammond of IBM Quantum in conversation with Simon Plant from NQCC


This latest collaboration builds on an existing technology agreement with Oxford Quantum Circuits, which provides access to its Lucy quantum processor for projects funded through the NQCC’s user engagement programme, called SparQ. At the same time the NQCC is establishing a capability in quantum emulation, a technique that enables a classical computer to run quantum circuits in the same way as a quantum machine. In this case the NQCC is leveraging a powerful software program created at the University of Oxford, called the Quantum Exact Simulation Toolkit (QuEST), and is currently working with one of the original developers to optimize its capabilities for use on the high-performance compute cluster at the Harwell Campus.
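
As a rough illustration of what “emulation” means in practice, the sketch below builds a two-qubit Bell state with a plain statevector simulator written in NumPy. It is purely illustrative: QuEST itself is a C/C++ library with its own API, optimized data layouts and distributed-memory support, none of which are reproduced here.

```python
# Toy statevector emulator (illustrative only -- not the QuEST API).
# A classical machine stores all 2^n complex amplitudes and applies gates
# as linear maps, which is what a quantum emulator does at much larger scale.
import numpy as np

def apply_gate(state, gate, target, n_qubits):
    """Apply a 2x2 single-qubit gate to `target` in an n-qubit statevector."""
    state = state.reshape([2] * n_qubits)
    state = np.moveaxis(state, target, 0)           # bring target axis to front
    state = np.tensordot(gate, state, axes=(1, 0))  # matrix-multiply that axis
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

def apply_cnot(state, control, target, n_qubits):
    """Apply a CNOT by flipping the target amplitudes in the control=1 subspace."""
    state = state.reshape([2] * n_qubits).copy()
    sel = [slice(None)] * n_qubits
    sel[control] = 1                                 # pick out control = |1>
    axis = target - 1 if target > control else target
    state[tuple(sel)] = np.flip(state[tuple(sel)], axis=axis).copy()
    return state.reshape(-1)

# Build the Bell state (|00> + |11>)/sqrt(2) on two qubits.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = np.zeros(4, dtype=complex)
psi[0] = 1.0                                         # start in |00>
psi = apply_gate(psi, H, target=0, n_qubits=2)
psi = apply_cnot(psi, control=0, target=1, n_qubits=2)
print(np.round(psi, 3))                              # ~[0.707, 0, 0, 0.707]
```

The memory cost of storing all 2^n amplitudes is what ultimately limits emulation, which is why high-performance clusters such as the one at Harwell are used to push n as high as possible.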


Meanwhile, a more established partnership with the National Physical Laboratory (NPL) extends from standards development through to technology collaboration. In one new initiative, also announced at the beginning of November, the two national facilities will be cornerstone members of a pilot network for quantum standards in the UK. “The NQCC will bring their skills and knowledge in quantum computing, as well as their connections with the start-up community, ” explains John Devaney, the NPL’s quantum standards manager. “We can share our expertise in standards development, while also keeping track of international initiatives that might influence our approach here in the UK.”


The Quantum Standards Network Pilot will also include representatives from government bodies, the BSI standards organization and UKQuantum, an industry association that aims to provide a single voice for companies developing quantum technologies. The aim of the pilot is to kickstart a conversation around standards and accreditation for emerging quantum platforms, find the best model for effective collaboration between key stakeholders in the UK, and make the case for a longer term focus on developing standards that support the growth of the UK’s quantum ecosystem.


“Standards are all about opening up markets and promoting innovation, but there is also a risk of standardizing too early or too narrowly,” says Devaney. “The pilot network will provide a forum for people to share information and discuss ideas, plus we will run workshops focused on particular industry sectors so that quantum developers can talk to potential customers about the assurances they will need to be confident that the technology will work for them.”


Devaney points out that the first step towards standards development is to establish proper test and evaluation protocols for diverse quantum technologies, something that NPL has been working on for many years.  “We already collaborate with start-up companies and research organizations to provide independent measurements and characterization of their prototypes,” he says. “By working with the NQCC we will be able to translate those test and measurement procedures into standards that are appropriate for the UK’s quantum computing community.”


NPL’s expertise in quantum test and measurement will also help the NQCC’s technical teams as they establish experimental platforms based on ion traps, superconducting circuits and, in the most recent addition, neutral atoms. Beyond the characterization of these emerging computing platforms, NPL has also been developing similar quantum technologies for applications in metrology, including ion-trap devices that were originally developed for time and frequency measurements.


The two organizations have now secured funding to transfer an ion-trap chip originally developed at NPL to the NQCC for further development and scale-up. “This device could be really useful for quantum computing, but we do not have the resources or the infrastructure to take the technology forward,” says Guido Wilpers, a senior scientist at NPL. “The NQCC is the best place to explore and expand its capabilities for quantum computing, while we will be able to share our knowledge of the device and collaborate on future test and measurement strategies.”


Indeed, the first ion-trap platform to be built at the NQCC exploits a device that emerged from research at the University of Oxford, showing how pioneering work within the UK’s academic sector can enable the NQCC’s technical teams to focus on scaling up the technology rather than designing and building new devices from scratch. Over the last year quantum researchers at the NQCC have also been visiting leading academic groups in the UK to understand current best practice, make informed experimental design decisions, and explore the possibilities for future collaborations – with the ion-trap team already involved in several joint projects with academic and industrial partners.


Meanwhile, the NQCC’s new partnership with the Quantum Software Lab (QSL) at the University of Edinburgh is already opening up exciting new opportunities for engaging with industry on applications development. The first project in the pipeline, announced in early November, will see QSL and NQCC researchers work with high-street bank HSBC and technology provider Rigetti to develop quantum machine-learning approaches to tackle the growing problem of money laundering.


This collaboration aligns with the NQCC’s ambitions of exploring impactful early applications of quantum computing, showcasing the potential of the technology, stimulating user adoption, and shaping the UK’s quantum computing user community

Elham Kashefi, director of the QSL and the NQCC’s chief scientist


HSBC already exploits classical machine learning to detect anomalies in financial transactions that could indicate criminal behaviour, and believes that a quantum-enabled solution could reduce risk and improve its anti-fraud services. “The growing rates of financial crime globally means it is imperative that we find an enhanced way to stop people becoming a victim of fraudsters,” commented Martin Brown, a specialist in fraud analytics at HSBC. “Quantum computing has the potential to be a game changer in this arena.”


As part of the project, QSL will take the lead on the development of quantum machine-learning algorithms, which will then be run on Rigetti’s 24-qubit platform. But the project partners are already working closely together to develop an optimal and practical solution. “We have project meetings every two weeks to ensure that the algorithms we are developing make best use of the capabilities of the hardware,” explains Ross Grassie, the QSL’s technical programme manager. “HSBC is also really keen to learn more about quantum machine-learning approaches, and having their involvement right from the start will help us to create a practical solution that meets their needs.”


For Elham Kashefi, director of the QSL and the NQCC’s chief scientist, the project shows how industry and academia can work together to accelerate innovation and develop quantum-enabled solutions that meet a clear commercial need. “This is one of the very first projects of the QSL and we are eager to use it as a working model for other projects to come,” she says. “This collaboration aligns with the NQCC’s ambitions of exploring impactful early applications of quantum computing, showcasing the potential of the technology, stimulating user adoption, and shaping the UK’s quantum computing user community.”


Elham Kashefi


The post Partnerships push for quantum advantage appeared first on Physics World.


]]>
https://hadamard.com/c/partnerships-push-for-quantum-advantage/feed/ 0 92
Everything, everywhere all at once https://hadamard.com/c/everything-everywhere-all-at-once/ https://hadamard.com/c/everything-everywhere-all-at-once/#respond Wed, 29 Nov 2023 16:00:00 +0000 https://news.mit.edu/2023/everything-everywhere-all-once-1129 Continue reading Everything, everywhere all at once]]> The way Morgane König sees it, questioning how we came to be in the universe is one of the most fundamental parts of being human.

When she was 12 years old, König decided the place to find answers was in physics. A family friend was a physicist, and she attributed her interest in the field to him. But it wasn’t until a trip back to her mother’s home country of Côte d’Ivoire that König learned her penchant for the subject had started much younger. No one in Côte d’Ivoire was surprised she was pursuing physics — they told her she’d been peering upward at the stars since she was a small child, wondering how they all had come together.

That wonder never left her. “Everyone looks at the stars. Everyone looks at the moon. Everybody wonders about the universe,” says König. “I’m trying to understand it with math.”

König’s observations have led her to MIT, where in 2021 she continued studying theoretical cosmology as a postdoc with physicist and cosmologist Alan Guth and physicist and historian of science David Kaiser. Now, she is a member of MIT’s 2023-24 Martin Luther King (MLK) Visiting Professors and Scholars Program cohort, alongside 11 others. This year, members of the MLK Scholars are researching and teaching diverse subjects including documentary filmmaking, behavioral economics, and writing children’s books.

Once she was set on physics, König finished her undergraduate studies in 2012, double-majoring in mathematics and physics at Pierre and Marie Curie University in Paris.

Still compelled by questions about the universe, König narrowed in on cosmology, and graduated with her master’s degree from Pierre and Marie Curie in 2014. The way König describes it, cosmology is like archaeology, just up in space. While astronomers study galaxy formations and mutations — all of the stuff in the universe — cosmologists study everything about the universe, all at once.

“It’s a different scale, a different system,” says König. “Of course, you need to understand stars, galaxies, and how they work, but cosmologists study the universe and its origin and contents as a whole.”

From practice to theory

Throughout her studies, König said, she was often the only woman in the room. She wanted to pursue the theories behind cosmology but wasn’t encouraged to try. “You have to understand that being a woman in this field is super, incredibly difficult,” says König. “I told everyone I wanted to do theory, and they didn’t believe in me. So many people told me not to do it.”

When König had the opportunity to pursue a PhD in observational cosmology in Marseille and Paris, she almost accepted. But she was more drawn to theory. When she was offered a spot with a little more freedom to study cosmology at the University of California at Davis, she took it. Alongside Professor Nemanja Kaloper, König dove into inflation theory, looking all the way back to the universe’s beginning.

It is well-known that the universe is always expanding. Think about inflation as the precursor to that expansion — a quick and dramatic beginning, where the universe grew exponentially fast.

“Inflation is the moment in history that happened right after the beginning of the universe,” says König. “We’re not talking about 1 second, not even a millisecond. We are talking 10 to the negative 32nd seconds.” In other words, it took 0.000,000,000,000,000,000,000,000,000,000,01 seconds for the universe to go from something minuscule to, well, everything. And today, the universe is only getting bigger.

Only a sliver of the universe’s composition is understandable using current technology — less than 5 percent of the universe is composed of matter we can see. Everything else is dark matter and dark energy.

For decades, cosmologists have been trying to excavate the universe’s mysterious past using photons, the tiny particle form of light. Since light travels at a fixed speed, light emitted further back in the universe’s history, from objects that are now farther away from us due to the expansion of the universe, takes longer to reach Earth. “If you look at the sun — don’t do it! — but if you did, you’d actually be seeing it eight minutes in the past,” says König. As they carve their way through the universe, photons give cosmologists historical information, acting as messengers across time. But photons can only account for the luminous 4.9 percent of the universe. Everything else is dark.

Since dark matter doesn’t emit or reflect photons like luminous matter, researchers can’t see it. König likens dark matter to an invisible person wearing a tuxedo. She knows something is there because the tuxedo is dancing, swinging its arms and legs around. But she can’t see or study the person inside the suit using the technology at hand. Dark matter has stirred up countless theories, and König is interested in the methods behind those theories. She is asking: How do you study something dark when light particles are necessary for gathering historical information?

According to König and her MIT collaborators — Guth, the forerunner of inflation theory, and Kaiser, the Germeshausen Professor of the History of Science — the answer might lie in gravitational waves. König is using her time at MIT to see if she can sidestep light particles entirely by using the ripples in spacetime called gravitational waves. These waves are caused by the collision of massive, dense stellar objects such as neutron stars. Gravitational waves also transmit information across the universe, in essence giving us a new sense, like hearing is to seeing. With data from instruments such as the Laser Interferometer Gravitational Wave Observatory (LIGO) and NANOGrav, “not only can we look at it, now we can hear the cosmos, too,” she says.

Black in physics

Last year, König worked on two all-Black research teams with physicists Marcell Howard and Tatsuya Daniel. “We did great work together,” König says, but she points out how their small group was one of the largest all-Black theoretical physics research teams — ever. She emphasizes how they cultivated creativity and mentorship while doing highly technical science, producing two published papers (Elastic Scattering of Cosmological Gravitational Wave Backgrounds and An SZ-Like Effect on Cosmological Gravitational Wave Backgrounds).

Out of the 69,238 people who have earned doctorates in physics and astronomy since 1981, only 122 of them were Black women, according to the National Center for Science and Engineering Statistics. When König finished her PhD in 2021, she became the first Black student at UC Davis to receive a PhD in physics and the ninth Black woman to ever complete a doctorate in theoretical physics in the United States.

This past October, in a presentation at MIT, König ended with an animated slide depicting a young Black girl sitting in a dark meadow, surrounded by warm lights and rustling grass. The girl was looking up at the stars, her eyes full of wonder. “I had to make this with AI,” says König. “I couldn’t find an image online of a young Black girl looking up at the stars. So, I made one.”

In 2017, König went to Côte d’Ivoire, spending time teaching school children about physics and cosmology. “The room was full,” she says. Adults and students alike came to listen to her. Everyone wanted to learn, and everyone echoed the same questions about the universe as König did when she was younger. But, she says, “the difference between them and me is that I was given a chance to study this. I had access to people explaining how incredible and exciting physics is.”

König sees a stark disconnect between physics in Africa and physics everywhere else. She wants universities around the world to make connections with African universities, building efforts to encourage students of all backgrounds to pursue the field of physics.

König explains that ushering in more Black and African physicists means starting at the beginning and encouraging more undergraduates and young students to enter the field. “There is an enormous amount of talent and brilliance there,” König says. She sees an opportunity to connect with students across Africa, building the bridges needed to help everyone pursue the questions that keep them looking up at the stars.

While König loves her research, she knows theoretical cosmology still has far to go as a discipline. “There is so much room to grow in the field. It’s not all figured out.”

]]>
https://hadamard.com/c/everything-everywhere-all-at-once/feed/ 0 219
Ireland publishes national strategy for quantum research https://hadamard.com/c/ireland-publishes-national-strategy-for-quantum-research/ https://hadamard.com/c/ireland-publishes-national-strategy-for-quantum-research/#respond Wed, 29 Nov 2023 15:30:39 +0000 https://physicsworld.com/?p=111587 Continue reading Ireland publishes national strategy for quantum research]]> The Irish government has published a national strategy for quantum research in the country. Many of the top technology companies have operations in Ireland and the report – Quantum 2030 – A National Quantum Technologies Strategy for Ireland – describes Ireland as being ideally situated to capitalize on quantum for industry, noting the potential for quantum technologies in computing, communication, simulation and sensing.


“This initiative is a brilliant step in the right direction,” says quantum physicist J C Seamus Davis from University College Cork. “We need to increase training through research for scientists, engineers, mathematicians, electrical engineers and for what in future will be called quantum engineers.”


The report says that nine of the top ten global software companies and three of the top four internet companies have significant operations in Ireland.


“What we need is for some of those companies to open quantum technology research labs in Ireland and begin to recruit young Irish scientists,” says Davis.


Yet Ireland currently trails similarly sized countries in Europe in quantum technologies. “We have a long way to go if we want to have an Irish company building or selling quantum computers or their components,” adds Davis. “We’re not at a scale to be competitive with the Netherlands, Denmark or Finland.”


The post Ireland publishes national strategy for quantum research appeared first on Physics World.


]]>
https://hadamard.com/c/ireland-publishes-national-strategy-for-quantum-research/feed/ 0 94
Ireland set to join the CERN particle-physics lab https://hadamard.com/c/ireland-set-to-join-the-cern-particle-physics-lab/ https://hadamard.com/c/ireland-set-to-join-the-cern-particle-physics-lab/#respond Wed, 29 Nov 2023 15:00:53 +0000 https://physicsworld.com/?p=111548 Continue reading Ireland set to join the CERN particle-physics lab]]> The Irish government has finally applied to join the CERN particle-physics laboratory near Geneva as an associate member. The application will be considered at CERN’s next council session in mid-December.


CERN has 23 full member states with Cyprus, Estonia and Slovenia applying for that status too. Member countries pay costs towards CERN’s programmes and have representation on the CERN council. The lab currently has seven associate member countries, with Brazil on track to become the next associate member and Chile in the early stages of applying.


In announcing its application to join CERN, the Irish government says that associate membership will open doors for Ireland’s researchers and technicians, making them eligible for staff positions and fellowships at CERN, as well as for training schemes. Irish companies will also have greater access to CERN procurement programmes.


The cost of full membership for Ireland would be around €15.9m each year, with associate membership set at a minimum of 10% of that, or €1.59m per year. Ireland may, though, recoup some of that cost through industry contracts, CERN positions and through training and education. The Irish government also approved an additional €300,000 per year for Irish researchers and teachers to participate in CERN programmes.


Ireland has long debated whether to join CERN, with scientists in Ireland already playing a part in CERN experiments such as LHCb, CMS and ISOLDE, the lab’s isotope mass separator facility. A turning point towards joining the lab came in 2019 when a cross-party Irish parliamentary committee recommended the move. It warned that Ireland’s attractiveness to hi-tech companies and its claims to be a “knowledge economy” could be damaged by its absence from CERN.


This sends a strong message about the government’s intentions to invest in fundamental science

Lewys Jones


Sinéad Ryan, a theoretical physicist in Trinity College Dublin and leading advocate for membership, believes that Ireland’s membership will be “transformational” for science and particle physics in the country. “The associate membership track allows you to dial up your commitment and dial it back down again as you need to,” she says, adding that previous involvement with CERN for scientists in Ireland relied on informal relationships and “the generosity of colleagues outside of Ireland”.


Enda McGlynn, a particle physicist at Dublin City University, who previously worked at ISOLDE, says that physicists from Ireland always had to partner with groups from member states to participate in lab activities. This, he says, made it difficult “to create a sustained research agenda that we could pursue ourselves”. McGlynn hopes that associate membership will now give fresh impetus for improved fundamental research funding in the country.


The wider Irish physics community has welcomed the news too. “This sends a strong message about the government’s intentions to invest in fundamental science,” says Lewys Jones, a physicist at Trinity College Dublin.  “It demonstrates a welcome shift in ambition for a small country to contribute to this big project.”


CERN is likely to send a task force to Ireland early next year with accession then being considered by CERN’s council in mid-2024. If all goes well, the process could be complete by the end of 2024.


Yet any hopes that Ireland will become a fully fledged member may have to wait. A spokesperson for Ireland’s department of science, innovation and education told Physics World that the government will review membership status five years after taking it up “to determine Ireland’s future relationship with CERN”.


The post Ireland set to join the CERN particle-physics lab appeared first on Physics World.


]]>
https://hadamard.com/c/ireland-set-to-join-the-cern-particle-physics-lab/feed/ 0 95
Flexible optical fibres deliver light to nerves for optogenetic pain inhibition https://hadamard.com/c/flexible-optical-fibres-deliver-light-to-nerves-for-optogenetic-pain-inhibition/ https://hadamard.com/c/flexible-optical-fibres-deliver-light-to-nerves-for-optogenetic-pain-inhibition/#respond Wed, 29 Nov 2023 09:45:51 +0000 https://physicsworld.com/?p=111503 Continue reading Flexible optical fibres deliver light to nerves for optogenetic pain inhibition]]> Soft, implantable optical fibres that move and stretch with the body have been developed by researchers from the US for use in optogenetics studies. The tool will help scientists identify the mechanisms underlying nerve pain and other peripheral nerve disorders in animal models and help to develop new treatments.


Peripheral nerve pain is a condition that occurs when nerves outside of the brain and spinal cord become damaged. Its symptoms can include not only physical pain, but also tingling and numbness in the affected limbs. It is estimated that around 2.4% of people worldwide live with some form of peripheral neuropathy.


When it comes to studying nerve conditions in the brain, scientists can turn to optogenetics. This is a technique in which nerves, mainly in animal models, are genetically engineered to respond to light. By either activating or inhibiting a given nerve, researchers can gain information on how it works and interacts with its surroundings. In fact, optogenetics has already helped to trace the neural pathways underlying a variety of brain disorders, including those affecting mood and sleep, as well as addiction and Parkinson’s disease. The technique has also helped in the development of targeted therapies against these conditions.


To date, optogenetics has largely been confined to the brain, where rigid devices can be implanted relatively painlessly thanks to the lack of pain receptors – and where tissue movement is limited. In contrast, peripheral nerves experience near-constant pulling and pushing from the surrounding muscles and tissues.


As Siyuan Rao, a biomedical engineer at the University of Massachusetts at Amherst, explains in a press statement: “Current devices used to study nerve disorders are made of stiff materials that constrain movement, so we can’t really study spinal cord injury and recovery if pain is involved.” Rigid optogenetic implants also increase the risk of tissue damage. To make optogenetics more practical for nerves located outside of the brain – and potentially also safer within it – Rao and colleagues looked to develop a more flexible form of implant that could move with the body.


Their solution is a soft, stretchable, transparent fibre made from hydrogel, a biocompatible mix of polymers and water. By fine-tuning the ratio of these ingredients, the team created jelly-like solutions peppered with nanoscale polymer crystals. They used two of these materials – each with a specific refractive index – to fashion the core and outer cladding layers of their optical fibre.
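
The core-and-cladding design follows the same guiding condition as any step-index optical fibre; the relations below are textbook fibre optics rather than values taken from the paper (the hydrogel indices themselves are not quoted here).

```latex
\[
n_{\text{core}} > n_{\text{clad}}, \qquad
\mathrm{NA} \;=\; \sqrt{\, n_{\text{core}}^{2} - n_{\text{clad}}^{2} \,}, \qquad
\theta_{\text{accept}} \;=\; \arcsin(\mathrm{NA}),
\]
```

so light launched within the acceptance angle of the fibre axis is trapped in the higher-index core by total internal reflection.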


To test the design, the researchers implanted their fibres into mice whose nerves had been genetically modified such that they are activated by blue light and inhibited by yellow light. The team found that the mice were far less sensitive to pain when yellow laser light was sent along the fibre, with such illumination significantly inhibiting sciatic pain in the rodents.


The mice were able to run quite freely on a wheel with the fibres in place, with one end fixed to the skull and the other running down the leg, attached to the sciatic nerve via a flexible cuff structure. Furthermore, the implants remained robust and functional even after two months of running (more than 30,000 deformation cycles), overcoming the limitations of traditional hydrogels.


“Our fibres can adapt to natural motion and do their work while not limiting the motion of the subject. That can give us more precise information,” Rao notes. Co-author Xinyue Liu, now at Michigan State University, adds: “Now, people have a tool to study the diseases related to the peripheral nervous system, in very dynamic, natural and unconstrained conditions.”


“Light serves as a versatile tool for in vivo sensing, imaging and biomodulation. However, the significant optical attenuation in tissues has limited its applicability in various scenarios,” comments Seok-Hyun Andy Yun, a bio-optics expert from Harvard University who was not involved in the study. “The use of minimally invasive hydrogel fibres shows tremendous potential to overcome this limitation, paving the way for expanded applications of optical techniques in animal studies.”


“The novelty of this approach is the relative ease of use, as well as its efficacy and plasticity of usage,” adds neurobiologist Federico Iseppon of University College London. “The limitation I would see in this technology is that, even if it surely is an easier approach than wireless implants for peripheral neuroscience studies, it still retains difficult challenges and requires specific technical expertise both in the production of the fibres and their surgical delivery to the tissues of interest.”


With their initial study complete, the researchers are now working to scale up their fibres for use in larger animals, and to couple optogenetic control of nerves with the ability to record neural activity.


“We are focusing on the fibre as a new neuroscience technology. We hope to help dissect mechanisms underlying pain in the peripheral nervous system,” Liu adds. She concludes: “With time, our technology may help identify novel mechanistic therapies for chronic pain and other debilitating conditions such as nerve degeneration or injury.”


In future, the team says, it may also be possible to use the same approach beyond peripheral nerves, targeting mobile organs like the heart and the gastrointestinal system.


The study is described in Nature Methods.


The post Flexible optical fibres deliver light to nerves for optogenetic pain inhibition appeared first on Physics World.


]]>
https://hadamard.com/c/flexible-optical-fibres-deliver-light-to-nerves-for-optogenetic-pain-inhibition/feed/ 0 98
Bridging minds and computers through physics https://hadamard.com/c/bridging-minds-and-computers-through-physics/ https://hadamard.com/c/bridging-minds-and-computers-through-physics/#respond Wed, 29 Nov 2023 08:57:42 +0000 https://physicsworld.com/?p=111496 Continue reading Bridging minds and computers through physics]]>




Brain–computer interfaces (BCIs) have enabled paralysed people to operate computers by thought alone. They have also been used to restore speech after it has been lost due to a stroke, and have shown promise for restoring sight to the blind. As applications expand, BCIs are even starting to be used for real or perceived mental augmentation.


This video explains how knowledge from physics and materials science is helping to improve BCIs – to make them safer, more durable and widely available. Learn more on this topic in the recent article ‘Plug me in: the physics of brain–computer interfaces’.


The post Bridging minds and computers through physics appeared first on Physics World.


]]>
https://hadamard.com/c/bridging-minds-and-computers-through-physics/feed/ 0 99
Robert Oppenheimer: how cinema has depicted this icon of the nuclear age https://hadamard.com/c/robert-oppenheimer-how-cinema-has-depicted-this-icon-of-the-nuclear-age/ https://hadamard.com/c/robert-oppenheimer-how-cinema-has-depicted-this-icon-of-the-nuclear-age/#respond Tue, 28 Nov 2023 09:17:15 +0000 https://physicsworld.com/?p=110759 Continue reading Robert Oppenheimer: how cinema has depicted this icon of the nuclear age]]> In the summer of 1960 I set off for the Los Alamos National Laboratory in New Mexico, having just finished my bachelor’s degree in physics from the Polytechnic Institute of Brooklyn, now part of New York University. I had gained a high Q-level security clearance and was qualified to enter Los Alamos on a summer programme for students. It was only 15 years after Robert Oppenheimer and his team of scientists and engineers on the Manhattan project had detonated the world’s first atomic bomb – the famous 1945 Trinity test – but a sense of atomic history already pervaded the lab.


My research group reported to Stanislaw Ulam, the Polish mathematician who had co-invented a working hydrogen bomb with Edward Teller barely a decade earlier. Another member of the group, meanwhile, had helped assemble the Trinity bomb. Holed away on this desert plateau, which sits more than 2200 m above sea level, my abiding impression of Los Alamos was of the thin, crystalline air – flooded with sunshine – that seemed to promote a kind of otherworldly thinking. It was as if these strange conditions were needed for those great minds to develop their world-shaking bomb.


Oppenheimer 2024 movie Cillian Murphy


Most people, however, have never experienced Los Alamos first hand as I did. Instead, their impressions of Oppenheimer and the Manhattan project will rest on the many movies, documentaries and books made about that war-time era. Interest in his life and legacy is perhaps higher than ever thanks to Christopher Nolan’s blockbuster film Oppenheimer (2023). A huge box-office hit, it is, however, just the latest of many efforts to present the origins of the nuclear age, its science, people and policies including Oppenheimer’s central role.


Nolan’s film tells the Los Alamos and Trinity stories chiefly through Oppenheimer’s story. He is depicted as a person, a scientist and a scientific leader, with the main narrative thread being the loss of his security clearance in 1954 – under suspicion of being a Soviet spy – following an investigation and interrogation by the Atomic Energy Commission (AEC). He is well played by Cillian Murphy, whose subtle facial expressions and body language show the many layers of Oppenheimer’s complex mind and personality: his blend of arrogance and naiveté; the scale of his emotions as he reacts to personal tragedy or to the atomic bombing of Japan.


The movie, for me, is a compelling portrait of a man who bore the burden of having created a terrible weapon that killed tens of thousands of people. He then faced the bitter irony that the same government and country that had asked him to build it declared him to be untrustworthy, ending any further involvement of his in building or advising on nuclear weapons. But even with a running time of three hours, the film cannot fully tell the complex and difficult story of Oppenheimer and the bomb. Fortunately, there are many other movies as well as books and plays (see box below) to turn to.


Oppenheimer through the decades


The very first cinematic portrayal – The Beginning or the End – was released in 1947, barely two years after the end of the war. Part fiction, it is framed as a documentary about the Manhattan Project, made for the benefit of future humanity, should we survive the nuclear age. It tells the story of the bomb from the discovery of nuclear fission to the destruction of Hiroshima and Nagasaki. Actors play Oppenheimer (although he is not a major character), Albert Einstein and General Leslie Groves – the military head of the Manhattan Project – and others in fictionalized but more-or-less historically and scientifically valid scenes.



Significantly, the film is ambivalent about the morality of using the bomb. Members of the fictional bomber crew at Hiroshima are stunned by the inferno they have wrought, but imply that it is payback for Japan’s treacherous attack on Pearl Harbor. A fictional young physicist on the bomb project is its conscience, regularly expressing doubts about the bomb. As he dies of radiation sickness, he wonders if this is retribution for working on the bomb. In a bizarre final scene, though, his voice from the grave predicts that atomic energy will give humanity a golden future.


As Los Alamos and knowledge of nuclear war entered the general consciousness, it wasn’t long before science fiction got in on the act. Several science-fiction movies in the 1950s featured atomic blasts or monsters created by nuclear radiation, notably Godzilla (1954), in which radiation awakens a gigantic prehistoric reptile that rampages through Tokyo. The Day the Earth Stood Still (1951) presented an equally bleak message, as an alien emissary warns humanity to be careful with nuclear weapons or face dreadful consequences.


Other feature films about nuclear war were just as sombre but more realistic. In On the Beach (1959), a catastrophic global nuclear exchange occurs (possibly by accident), after which the inhabitants of Australia and an American nuclear submarine crew despairingly await a radioactive cloud that will kill these last remnants of humanity. Then there is the classic French New Wave film Hiroshima Mon Amour (1959), which intertwines our perceptions of Hiroshima’s nuclear devastation and of a hopeless love affair to heighten our responses to both.


Later movies to deal memorably with nuclear war include Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb (1964) and Fail-Safe (1964). Only in 1989, though, did another feature film depict the Manhattan Project. That was Fat Man and Little Boy, which uses the code names for the bulky Nagasaki plutonium bomb and smaller Hiroshima uranium bomb. Oppenheimer (Dwight Schultz) features prominently in the film, but he is overshadowed by Paul Newman as General Groves, though both are superficially drawn.



The film does, however, present the technical challenges in developing the bomb, such as designing trigger mechanisms to rapidly bring sub-critical pieces of fissionable material to critical mass and initiate the nuclear explosion. Fat Man and Little Boy also spotlights nuclear dangers, as a fictional Los Alamos physicist dies miserably from radiation in circumstances portrayed like those that killed two real physicists, Harry Daghlian and Louis Slotin, who died after Trinity while conducting experiments that went horrifically wrong.


Bomb documentaries


The 1980s saw the start of a number of documentaries about the building of the bomb, the most important of which is The Day After Trinity (1981). It relies solely on real US government footage, newsreels and photos. Directed by Jon Else, it also uses filmed interviews with 20 people who knew or worked with Oppenheimer or who were affected by the atomic bomb project. There are even archival appearances by Oppenheimer and other major figures such as US President Harry Truman.


The documentary vividly portrays Oppenheimer’s life, intellect and thoughts. Hans Bethe, who headed the theory section at Los Alamos and later won the 1967 Nobel Prize for Physics for his work on stellar nucleosynthesis, is shown raising one of many questions about Oppenheimer’s complex personality. “We ask,” he wonders on screen, “why people with a kind heart and humanist feelings [would] work on weapons of mass destruction.”


One answer comes from Oppenheimer’s close friend, Berkeley professor Haakon Chevalier. In an interview in the film, he explains that Oppenheimer, who was born in the US into a Jewish family with strong ties to Europe, had been greatly alarmed by the rise of Nazism. We learn too about Oppenheimer’s rare scientific talent, with Bethe claiming he was “intellectually superior” to everyone at Los Alamos. “[He] knew and understood everything…chemistry or theoretical physics or machine shop. He could keep it all in his head.”


Like The Beginning or the End, the film follows the story through to Hiroshima but treats moral questions more deeply. Courageously, it includes painful footage of the suffering of burned and injured adults and children after the Hiroshima bombing, turning abstract issues of morality into the real and devastating consequences for innocent people. It also shows that some Los Alamos scientists were concerned about the moral issues the bomb would raise.


One was the physicist Robert Wilson, who headed the experimental research division at Los Alamos and later became the first director of the Fermi National Accelerator Laboratory in the US. In the film Wilson tells how, sometime between April 1945 and the Trinity test in July, he called a meeting about whether work on the test bomb should continue. Oppenheimer tried to dissuade him, but the meeting went ahead anyway. Oppenheimer told the scientists present that the Trinity test was essential so the world would know that this “horrible thing” existed as the new United Nations was being formed. The remarks convinced the attendees to continue preparing the bomb, though, post-war, Wilson gave up his security clearance and never again worked on nuclear energy or bombs.



In The Day After Trinity, an interviewer is shown asking Oppenheimer in the 1960s about controlling the spread of nuclear weapons. “It’s 20 years too late,” Oppenheimer says quietly but firmly. “It should’ve been done the day after Trinity.” His idealistic wish for international nuclear control and his opposition to the hydrogen bomb are well known. Indeed, they weighed against him in the 1954 hearing, the stage for which was partly set by the rabid anti-communism of US Senator Joseph McCarthy.


Among those who testified for Oppenheimer were the Nobel laureates Enrico Fermi and Isidor Rabi as well as Bethe and Groves; his former colleague Edward Teller, who championed the hydrogen bomb, spoke against him. But as The Day After Trinity also shows, Oppenheimer’s own unforthcoming testimony served him poorly. As Robert P Crease explains elsewhere in Physics World, he was flummoxed in questioning by attorney Roger Robb, who accused Oppenheimer of going beyond science and trying to counsel on military strategy.


The film makes clear that the revocation of Oppenheimer’s clearance was a great blow. His physicist brother Frank tells us “it really knocked him for a loop;” Bethe relates that “he was not the same person afterwards”; and Rabi says the revocation “actually almost killed him spiritually, yes. It achieved what his opponents wanted to achieve. Destroyed him.”


Oppenheimer in literature and on stage


The inherent drama of the atomic-bomb story, its moral issues, and the intricacies of Robert Oppenheimer’s character have inspired not just countless movies and documentaries (see main text) but also stage plays and an opera. Perhaps the earliest of these is In the Matter of J Robert Oppenheimer by German playwright Heinar Kipphardt, which was first performed in 1964. Whereas Christopher Nolan’s Oppenheimer film weaves the Atomic Energy Commission hearing through a larger story, Kipphardt’s play is set entirely inside the hearing room and is based on thousands of pages of actual testimony. One reviewer in the New York Times said that a 2006 off-Broadway revival posed “questions about moral relativism, the limits of vigilance and human decency”.


Oppenheimer by the RSC



Later, Oppenheimer by the British dramatist Tom Morton-Smith took a broader view. Premiered by the Royal Shakespeare Company in 2015, it starts with Oppenheimer’s left-wing connections in the 1930s and ends with the Trinity test. It includes the physics of the bomb, depicts figures such as Edward Teller, and comments on Oppenheimer’s moral stance toward building the bomb. Reviewers noted the epic Shakespearean sweep of Oppenheimer’s rise and fall: Physics World credited the play with carrying “considerable emotional punch”, while the Guardian said it evoked “an overall ache for humanity”. Later, the Los Angeles Times said of a California revival in 2018 that “the physics is dazzling, but even more intriguing are the complicated human beings behind the equations”.


If these stories are indeed epic, opera is surely the most powerful medium for telling them, as in Doctor Atomic by American composer John Adams with libretto by Peter Sellars. First presented at the San Francisco Opera in 2005, it concentrates on the reactions of Oppenheimer and others at Los Alamos as tension escalates with the approach of the Trinity test. Writing in Physics World, the historian Robert P Crease called one haunting scene, which conveys the turmoil in Oppenheimer’s soul that he had never openly expressed, “opera at its finest”. But Crease and others took issue with the characterizations of some of the leading figures. A review of a 2018 production at the Santa Fe Opera near Los Alamos says it does “spectacle” well, but “conveys a feeling of grief…rather than telling a story”.


We should not forget either the countless books about the nuclear age, two of the most famous of which each won a Pulitzer Prize. The first is Richard Rhodes’ The Making of the Atomic Bomb (1986), which is the authoritative study of the bomb project and its leading figures, including Oppenheimer. The other is American Prometheus: the Triumph and Tragedy of J Robert Oppenheimer (2005) by journalist Kai Bird and historian Martin J Sherwin. Perhaps the definitive Oppenheimer biography, it inspired Oppenheimer the movie and, as its title shows and as the film replicates, depicts Oppenheimer’s fall from grace in 1954.



For every generation


Taken together, these four movies – The Beginning or the End, The Day After Trinity, Fat Man and Little Boy and Oppenheimer – convey the urgency of the atomic project well. Fictional parts aside, they provide a reasonably accurate picture of the start of the nuclear era, while giving a decent scientific explanation of nuclear chain reactions, the difficulties of obtaining enough uranium-235 and plutonium to make bombs, and the technical ingenuity that made the bomb work. The strategic and political thinking behind the decision to bomb Japan – and the opposition to that step – are covered too.


But why do we need to keep recreating the story? One answer comes from Else, who directed The Day After Trinity. As he recently stated: “These stories have to be retold every generation, and they have to be told by new storytellers.” Nuclear weapons, in other words, are so dangerous that we have to underline their menace in new and different ways. Oppenheimer does this by focusing on the personality of Oppenheimer himself and by bringing a roster of Hollywood A-listers.


Excellent though the acting is in Oppenheimer, I feel it is The Day After Trinity that more powerfully shows us the real man and his contradictions, thanks also to comments from those who knew him. Rabi describes, for example, how Oppenheimer proudly strode along immediately after the Trinity blast, like a gunslinger in the classic film High Noon (1952). Later, however, as Rabi reminds us, Oppenheimer spoke out against the hydrogen bomb because it would not serve as a military weapon but only to kill civilians.


Oppenheimer’s doubts are made clear in his photo at the time of the AEC hearing, which shows the gaunt cheeks and haunted eyes of a man who has been spiritually tested and torn by building the bomb as was asked of him, seeing its destructive use that won the war, then finding himself rejected and his career destroyed. It is, in a sense, a tragedy, and why the book American Prometheus was so aptly titled. Oppenheimer was a scientific leader in a time and place that forced him, and others, into impossible moral choices.


A final chapter


Oppenheimer is not the final word. Unmentioned in the film is that in December 2022 Jennifer Granholm – secretary of the US Department of Energy, the successor to the AEC – announced that she had annulled the revocation of Oppenheimer’s security clearance. This was being done, Granholm said, to correct the record and honour his “profound contributions to our national defense and scientific enterprise at large”. The reversal was largely the result of efforts by the authors of American Prometheus.


Ground zero after the Trinity test


I can, however, personally attest that the scientific community not only rejected the original AEC decision but also revered Oppenheimer. As a graduate physics student in the early 1960s at the University of Pennsylvania, I went to hear him give a public lecture to a crowd of hundreds filling a large auditorium. Then nearly 60, he looked – from my vantage point in the hall – frail and even ethereal, but he must have had a tough core that sustained him through Los Alamos and the AEC hearing to stand before many eager to hear him.


Looking back, it’s clear that the atomic-bomb project affected the entire physics community. Oppenheimer, Einstein and others spoke out against the dangers of nuclear war, and physicists still do, through organizations such as the Bulletin of the Atomic Scientists and Scientists for Global Responsibility.


But as the US historian Daniel Kevles wrote in his 1978 book The Physicists: the History of a Scientific Community in Modern America, the success of the Manhattan Project also gave physicists “the power to influence policy and obtain state resources largely on faith”. Nuclear and high-energy physics benefitted from this new regard, but it also raised the prestige of physics in general and led to more financial support. That too is part of the complex scientific legacy and moral reckoning from the story of Oppenheimer and the atomic bomb.


As for me, my last direct link with the nuclear era came in 2002, when with other physicists attending a meeting in Albuquerque, I had the rare chance to visit the Trinity site at Alamogordo, New Mexico. A small stone pyramid with a plaque marked ground zero, in the midst of a nearly infinite sweep of land. The natural barrenness was a sign of what a nuclear bomb could do to a city. Near the pyramid, a fence surrounded a small mound of weathered concrete and metal. This was a remaining trace of the 30 metre-tall steel tower atop which the bomb was detonated, and which had vanished in the blink of an eye.


The post Robert Oppenheimer: how cinema has depicted this icon of the nuclear age appeared first on Physics World.


]]>
https://hadamard.com/c/robert-oppenheimer-how-cinema-has-depicted-this-icon-of-the-nuclear-age/feed/ 0 103
Optics and instrumentation firms share in the 2023 Institute of Physics business awards https://hadamard.com/c/optics-and-instrumentation-firms-share-in-the-2023-institute-of-physics-business-awards/ https://hadamard.com/c/optics-and-instrumentation-firms-share-in-the-2023-institute-of-physics-business-awards/#respond Mon, 27 Nov 2023 13:59:51 +0000 https://physicsworld.com/?p=111034 Continue reading Optics and instrumentation firms share in the 2023 Institute of Physics business awards]]> Last month I highlighted some of the medical-physics companies that won business innovation awards from the Institute of Physics in 2023. But firms in the photonics and instrumentation sectors have done well too, which is perhaps not surprising given that photonics is one of the UK’s largest physics-based industries. As the Photonics Leadership Group (PLG) said in a recent statement, the UK photonics sector is now worth £15.2bn, with like-for-like revenue growth of more than 7% between 2020 and 2022.





The PLG, which includes representatives from more than 60 photonics businesses in the UK, says that there is increased demand for photonics in everything from agriculture, health and communications to defence, satellites and manufacturing. It also points to an increasing commercialization of the burgeoning quantum-technology sector. Indeed, the PLG forecasts that the UK photonics sector will be worth more than £17bn in 2024 and grow to £50bn by 2035.


Having sat on the judging panel for the IOP’s business awards, I can say that there were some fantastic entries, which meant that picking winners wasn’t easy. In no particular order, however, the first winner I’d like to mention is Glasgow-based Coherent Scotland, which makes ultrafast lasers for applications in the life sciences and industry. The firm’s philosophy is to transform complex, physics-based optical technology into equipment that’s easy to use, even by people who aren’t laser experts. Their award recognizes in particular the company’s Axon range of femtosecond lasers, which it has developed over the last six years.


Operating at three wavelengths that are of particular use in biology (780, 920 and 1064 nm), the lasers can be easily fitted into existing imaging and industrial tools where space is at a premium. They exploit the technique of chirped pulse amplification – which was recognized by the 2018 Nobel Prize for Physics – by taking short, low-power pulses and stretching them out in time. The pulses, which now have a lower peak power, can then be safely amplified before being recompressed into short, higher-peak-power pulses.
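
The arithmetic behind that stretching step is simple: peak power is pulse energy divided by pulse duration, so spreading the same energy over a longer time keeps the amplifier safe. A minimal sketch with illustrative numbers (the pulse energy, durations and gain below are assumptions, not Axon specifications):

```python
# Chirped pulse amplification in numbers: stretch, amplify, recompress.
# All values are illustrative assumptions, not Axon laser specifications.
pulse_energy = 1e-9        # 1 nJ seed pulse
duration_short = 100e-15   # 100 fs pulse duration
stretch_factor = 1e4       # stretched to ~1 ns before amplification
gain = 1e3                 # energy gain of the amplifier

def peak_power(energy, duration):
    return energy / duration

print(f"Seed peak power:         {peak_power(pulse_energy, duration_short):.1e} W")

# Stretching spreads the same energy over a longer time, lowering the peak
# power so the amplifier and optics are not damaged.
print(f"Stretched peak power:    {peak_power(pulse_energy, duration_short * stretch_factor):.1e} W")

# Amplify while stretched, then recompress back to the short duration.
print(f"Recompressed peak power: {peak_power(pulse_energy * gain, duration_short):.1e} W")
```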


Traditionally, this technique is only ever found in large, facility-sized laser systems. But the Axon lasers, being so small, have already been used by neuroscientists for in-vivo brain imaging and studying neurodegenerative diseases such as Alzheimer’s and Parkinson’s. Other applications include cancer diagnostics and drug research, pharmaceutical testing, tropical-disease control and immunology. The future looks bright for the physics of ultrafast laser technology in the biological sciences.


Space success


Another winner is KEIT Industrial Analytics. Based at Harwell in Oxfordshire, it makes infrared Fourier transform spectrometers for monitoring and controlling industrial production processes. The technology was originally conceived at the Rutherford Appleton Laboratory in the late 2000s as a compact, rugged high-performance instrument for analysing the atmosphere of Mars. KEIT was spun out of the Rutherford lab in 2012.


Containing no moving parts, the company’s spectrometers use simple optics to generate an interferogram along a detector array. The data can then be Fourier transformed to return the whole spectrum, creating a robust instrument that can be used directly for instantaneous in-line analysis in production plants. That in turn offers real-time spectral data, as opposed to delayed, off-line sample analysis, which has huge advantages for yield and product quality in process-control applications.
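
The principle is straightforward to sketch: the interferogram recorded along the detector array is, in effect, the Fourier transform of the source spectrum, so transforming it back returns intensity versus wavenumber. A minimal numpy illustration with simulated data (this is not KEIT’s actual processing chain):

```python
import numpy as np

# Simulated Fourier-transform spectroscopy: an interferogram sampled along a
# detector array is Fourier transformed to recover the spectrum.
# All numbers are illustrative, not KEIT's processing chain.
n_samples = 2048
max_opd = 0.1                                 # maximum optical path difference (cm)
opd = np.linspace(0, max_opd, n_samples)

# Source with two emission lines, at 1200 and 1450 cm^-1
lines = [1200.0, 1450.0]
interferogram = sum(np.cos(2 * np.pi * k * opd) for k in lines)

# The Fourier transform of the interferogram gives intensity vs wavenumber
spectrum = np.abs(np.fft.rfft(interferogram * np.hanning(n_samples)))
wavenumber = np.fft.rfftfreq(n_samples, d=opd[1] - opd[0])   # cm^-1

strongest = wavenumber[np.argmax(spectrum)]
print(f"Strongest recovered line: {strongest:.0f} cm^-1 (lines were placed at 1200 and 1450)")
```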


KEIT’s technology is sold around the world to companies that use it for in-situ monitoring of everything from bioethanol and biofuels production to pulp and paper manufacturing. With the global analytical instruments market expected to grow at almost 4% a year to $71.4bn by 2027, I think KEIT has a bright future. The original Rutherford technology continues to be developed for future off-planet missions so watch this space.


Focal Point Positioning, meanwhile, won a 2023 IOP business award for its pioneering location-awareness technology. Based in Cambridge, the company was founded in 2015 by the physicist Ramsey Faragher, who was once dubbed “the real life Q” – in a nod to James Bond’s R&D guru – by Top Gear magazine. Faragher invented a technology that improves the accuracy, sensitivity and security of devices that use location data from global-navigation satellite systems (GNSS).


GNSS is a massive business area. More than $1 trillion of the US economy and over €800bn of the European economy currently depend on this incredible positioning and timing system. However, GNSS is not perfect, especially in cities, where signals can bounce off buildings, causing location-technology devices – in anything from cars to mobile phones – to be inaccurate. It’s why the blue dot on your Google Maps can be in the wrong place or why your Uber driver can’t find you.
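
The scale of the problem is easy to estimate: a receiver that locks onto a signal reflected off a nearby building measures the longer, reflected path, and the extra path length shows up directly as a ranging error. A toy example with an invented street-canyon geometry:

```python
import math

# A receiver that locks onto a building reflection measures the reflected
# path length instead of the direct one. Invented 2D geometry, for illustration only.
satellite = (0.0, 20_200_000.0)   # satellite roughly overhead at ~20,200 km altitude
receiver = (0.0, 0.0)
reflector = (30.0, 0.0)           # reflection point 30 m to the side of the receiver

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

direct = dist(satellite, receiver)
reflected = dist(satellite, reflector) + dist(reflector, receiver)

# The extra path length appears directly as a ranging error; at ~0.3 m per
# nanosecond of light travel it is also a timing error of roughly 100 ns here.
print(f"Extra path from the reflection: {reflected - direct:.1f} m")
```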


The company’s “Supercorrelation” technology gets round this problem by working out the direction of incoming signals through special software rather than needing costly antennas or other infrastructure. By detecting – and ignoring – fake signals, and recognizing reflected signals, the company’s technology allows metre-level positioning to be maintained on smartwatches, fitness devices and other consumer products even in the trickiest environments.


The company has more than 25 patent families and four trademarks on its technology. But Focal Point doesn’t make its own products. Instead, it licenses its technology to other firms, including Switzerland-based u-blox, which is the largest chipset designer in Europe. The company is also currently involved in licensing discussions with various leading global smartphone and car manufacturers, and has a partnership with General Motors already in place.


Finally, let me mention turboTEM, which won an IOP start-up award for its equipment that can make electron microscopes work better and last longer. Based in Dublin, Ireland, turboTEM was set up in 2022 by researchers from the ultramicroscopy research group at Trinity College, who wanted to take a number of emerging technologies from the lab out into the field. The company was recognized for its modular devices that can be easily and cheaply retrofitted to what is a vital imaging tool.


Electron microscopes are great tools but they don’t come cheap, typically costing millions of pounds. Users who want access to the latest advances and improvements in performance can’t easily go out and buy a new one, which is why an upgrade makes a lot of sense. So by extending the useful life of the instrument, turboTEM enables more cutting-edge science and development to be performed on existing tools.


In it to win it


As I have mentioned before, all physics-based firms require time and energy to develop products and become globally significant. There’s also the perennial difficulty of explaining a product idea, which is often quite specialized, to potential investors who have little or no science background. An IOP start-up award can therefore show that your technology has won approval from judges with solid physics and business experience.


As the chairperson of one award-winning company put it at a presentation held at the UK Houses of Parliament at the end of October, the award is “an extra solid data and trusted point for potential investors”. The high-profile event was hosted by the physicist and MP Alok Sharma, who has been a long-time supporter of the IOP business awards and served as president of the COP26 conference in Glasgow in 2021.





I hope, therefore, that your company, if you have one, will be inspired to apply. And even if you don’t work in a business, remember the IOP also offers three awards (Katharine Burr Blodgett, Denis Gabor and Clifford Paterson) for individuals or teams who have done innovative physics with a commercial angle. Good luck – and remember, you have to be in it to win it. Award entries for 2024 will be open soon so watch this space.


The post Optics and instrumentation firms share in the 2023 Institute of Physics business awards appeared first on Physics World.


]]>
https://hadamard.com/c/optics-and-instrumentation-firms-share-in-the-2023-institute-of-physics-business-awards/feed/ 0 105
Fusion industry outlines ambitious plans to deliver electricity to the grid by 2035 https://hadamard.com/c/fusion-industry-outlines-ambitious-plans-to-deliver-electricity-to-the-grid-by-2035/ https://hadamard.com/c/fusion-industry-outlines-ambitious-plans-to-deliver-electricity-to-the-grid-by-2035/#respond Wed, 22 Nov 2023 11:18:59 +0000 https://physicsworld.com/?p=111391 Continue reading Fusion industry outlines ambitious plans to deliver electricity to the grid by 2035]]> What is the Fusion Industry Association (FIA)?


The FIA is the independent business association for privately financed fusion-energy companies. We have 38 members, all of whom have different approaches to reaching commercial fusion energy.


How do you become a member?


You need to have raised private capital and demonstrate a plan for building a fusion power plant that will sell the power it generates. All of the FIA companies think that they can get there within the coming decades and have investors who believe it is possible.


Why fusion?


Fusion energy is clean, safe, sustainable, always on and always available with a fuel source that is virtually unlimited. By unlimited I mean that we have hundreds of millions or even billions of years of fuel here on Earth. Fusion creates no carbon emissions, no greenhouse-gas emissions and has no long-lived nuclear waste. There’s also no threat of having a nuclear meltdown so there is no safety impact to the public. It’s basically everything you could want from an energy source.


So why don’t we have it already?


It is scientifically very hard to do. We’re still at the point where we need to do a lot of engineering and still some science to get there. The challenge is not just to create fusion, but to create fusion with net energy – to get more energy out of the fusion reaction than you put in. But we believe we’re on the way to doing this and we believe the signposts are there that show we’re not in the realm of science fiction. In a decade or so we can have commercial fusion energy, putting clean, safe, sustainable fusion energy into the grid.


What other applications are there?


One is to use fusion to generate medical isotopes, which can be very helpful for certain forms of cancer treatment or in medical imaging. These are things that are happening right now. Another is fusion propulsion for space applications, which could mean getting from low-Earth orbit to Mars in a matter of weeks or a month instead of years. But the real “killer app” is energy production. We need alternative approaches and fusion is the ultimate energy source. What we need to meet the climate challenge is to have always on, always available, zero-carbon energy.


What are some of the ways to generate fusion? 


On one end is laser inertial fusion energy. This involves using a laser or other driver to put a lot of energy onto a very small target, creating an extreme pressure situation where fusion happens. At the other end of plasma physics is magnetically confined fusion, using powerful magnets to contain the plasma in a steady state, which also takes a lot of energy.


And laser fusion has seen some recent success?


Yes, last year physicists and engineers at the laser-based National Ignition Facility in California reported an energy gain in one of their fusion shots. We really see this as a Wright brothers moment. The Wright brothers understood that planes would fly and it took a whole new area of science – aeronautical engineering – to be able to do this. Likewise, we think we’re there with fusion and the plane has flown. We’re not selling it yet, but we’re on our way.


Andrew Holland


What do you think of other large-scale experiments, such as ITER, which is currently being built in Cadarache, France, and has been beset by delays and cost hikes?


ITER is being built with a very different need or approach than that of private approaches to fusion. ITER is, of course, an essential science experiment and an important example of how countries can work together. ITER was put together as low risk in terms of technology, but not low risk in terms of cost. ITER was designed with 1990s technology, but things have advanced so much since then. If you were building a computer today, you would build it with today’s technology, not 1990s technology. There is really important science that is going to come out of ITER but because of the huge costs involved it is too big to fail.


But private companies also don’t want to fail, right?


That’s right, but the way the markets work is: you try things and you fail. If you talk to a venture capital investor, they don’t want any of their investments to fail, but also they kind of expect them to. They look for that one out of 10 or even one out of 100 that will pay for the whole investment fund. So it’s a very different model.





In the FIA’s Global Fusion Industry in 2023 report, you identify 43 fusion companies, up from 33 in 2022. What is driving this growth?


Companies are coming into fusion because there is a market demand, and on the supply side, the science is ready. It is not like these companies are all doing the same thing; they’re all racing against each other and other technologies to meet the climate challenge.


Why is it important to have a wide range of approaches?


We don’t want to down-select too early. We shouldn’t say that one approach is going to be the only approach that’s going to work. The lesson from other technologies is that you need the market for it to work and you need to have competition to see what is the appropriate way forward.


The report also highlights that many in the industry expect to deliver electricity to the grid by 2035. What needs to happen to achieve this?


You have to do multiple things in parallel instead of in a sequential order. Many companies are building their proof-of-concept machine to prove net energy in a commercially relevant fusion plasma. If they can do this within the next four years, then they can move on to building a pilot plant.


And what then?


A pilot plant will do the science plus integrate the important engineering, such as being able to generate its own fuel through the interactions of neutrons with the wall. Just being able to have a pilot plant generating electricity, either at first in minutes and hours but then ultimately weeks, months and years is going to be a process. The first electricity to be produced in the 2030s won’t be cheap but we think that there is a pathway towards ultimately producing cheap electricity.


  • You can listen to a longer version of this interview in the 12 October episode of the Physics World Weekly podcast

The post Fusion industry outlines ambitious plans to deliver electricity to the grid by 2035 appeared first on Physics World.


]]>
https://hadamard.com/c/fusion-industry-outlines-ambitious-plans-to-deliver-electricity-to-the-grid-by-2035/feed/ 0 113
The biographer who inspired Christopher Nolan’s blockbuster film Oppenheimer https://hadamard.com/c/the-biographer-who-inspired-christopher-nolans-blockbuster-film-oppenheimer/ https://hadamard.com/c/the-biographer-who-inspired-christopher-nolans-blockbuster-film-oppenheimer/#respond Tue, 21 Nov 2023 18:22:00 +0000 https://physicsworld.com/?p=111437 Continue reading The biographer who inspired Christopher Nolan’s blockbuster film Oppenheimer]]>  

This episode of the Physics World Stories podcast features an interview with Kai Bird, co-author of the book that inspired the recent blockbuster film Oppenheimer, directed by Christopher Nolan. Winner of the 2006 Pulitzer Prize in Biography, American Prometheus: the Triumph and Tragedy of J. Robert Oppenheimer is an exploration of the brilliant and enigmatic physicist who led the project to develop the world’s first atomic weapons.

 

Oppenheimer is a fascinating but complicated character for a biographer to tackle. Despite excelling in his leadership of the Manhattan Project, Oppenheimer’s conscience was torn by the power he had unleashed on the world. “Now I am become Death, the destroyer of worlds,” is the line he infamously recalled from the Hindu scripture the Bhagavad Gita, upon witnessing the Trinity Test fireball in 1945.

 

Parallels between the nuclear dawn and AI today

The physicist’s relationship with politics was also fraught and difficult to define. Oppenheimer held personal connections with Communist Party members prior to the Second World War, and spent the post-war years warning against nuclear proliferation – provoking the ire of McCarthy Era politicians and ultimately having his security clearance revoked in 1954.

 

Unsurprisingly, American Prometheus is receiving a resurgence of interest following the success of Nolan’s film. Readers are fascinated once again with the dawn of the nuclear age, which Bird says has parallels with where we are today with AI and the threat of climate change. He also sees the political threads from McCarthyism to the post-truth tactics and populist playbook deployed in US politics today.

As always, the podcast is presented by Andrew Glester and you can read his review of the film Oppenheimer, as well as a recent opinion piece by Robert P Crease “What the movie Oppenheimer can teach today’s politicians about scientific advice“.

The post The biographer who inspired Christopher Nolan’s blockbuster film Oppenheimer appeared first on Physics World.

 

]]>
https://hadamard.com/c/the-biographer-who-inspired-christopher-nolans-blockbuster-film-oppenheimer/feed/ 0 115
Weak measurement lets quantum physicists have their cake and eat it https://hadamard.com/c/weak-measurement-lets-quantum-physicists-have-their-cake-and-eat-it/ https://hadamard.com/c/weak-measurement-lets-quantum-physicists-have-their-cake-and-eat-it/#respond Mon, 20 Nov 2023 15:00:22 +0000 https://physicsworld.com/?p=111377 Continue reading Weak measurement lets quantum physicists have their cake and eat it]]> Diagram of the entanglement certification scheme


Compared to scribbling mathematical expressions for entangled quantum states on a sheet of paper, producing real entanglement is a tricky task. In the lab, physicists can only claim a prepared quantum state is entangled after it passes an entanglement verification test, and all conventional testing strategies have a major drawback: they destroy the entanglement in the process of certifying it. This means that, post-certification, experimenters must prepare the system in the same state again if they want to use it – but this assumes they trust their source to reliably produce the same state each time.


In a new study, physicists led by Hyeon-Jin Kim from the Korea Advanced Institute of Science and Technology (KAIST) found a way around this trust assumption. They did this by refining conventional entanglement certification (EC) strategies in a way that precludes complete destruction of the initial entanglement, making it possible to recover it (albeit with probability < 1) along with its certification.


A mysterious state with a precise definition


Entanglement, as mysterious as it is made to sound, has a very precise definition within quantum mechanics. According to quantum theory, composite systems (that is, two or more systems considered as a joint unit) are either separable or entangled. In a separable system, as the name might suggest, each subsystem can be assigned an independent state. In an entangled system, however, this is not possible because the subsystems can’t be seen as independent; as the maxim goes, “the whole is greater than its parts”. Entanglement plays a crucial role in many fields, including quantum communication, quantum computation and demonstrations of how quantum theory differs from classical theory. Being able to verify it is thus imperative.


In the latest work, which they describe in Science Advances, Kim and colleagues studied EC tests involving multiple qubits – the simplest possible quantum systems. Conventionally, there are three EC strategies. The first, called witnessing, applies to experimental situations where two (or more) devices making measurements on each subsystem are completely trusted. In the second, termed steering, one of the devices is fully trusted, but the other isn’t. The third strategy, called Bell nonlocality, applies when none of the devices are trusted. For each of these strategies, one can derive inequalities which, if violated, certify entanglement.
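
As a concrete illustration of the first strategy, witnessing, take the textbook two-qubit case (a generic example, not the specific inequalities used in the KAIST experiment). For the Bell state |Φ+⟩ = (|00⟩ + |11⟩)/√2, the operator W = I/2 − |Φ+⟩⟨Φ+| is an entanglement witness: a measured value Tr(Wρ) < 0 certifies entanglement.

```python
import numpy as np

# Witnessing, in the simplest two-qubit setting (not the specific
# inequalities of the KAIST experiment). For |phi+> = (|00> + |11>)/sqrt(2),
# W = I/2 - |phi+><phi+| is a witness: Tr(W rho) < 0 certifies entanglement.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
W = np.eye(4) / 2 - np.outer(phi_plus, phi_plus)

def werner(p):
    # |phi+> mixed with white noise; entangled exactly when p > 1/3
    return p * np.outer(phi_plus, phi_plus) + (1 - p) * np.eye(4) / 4

for p in (0.2, 0.5, 0.9):
    value = np.real(np.trace(W @ werner(p)))
    verdict = "entangled" if value < 0 else "not certified"
    print(f"p = {p}:  Tr(W rho) = {value:+.3f}  ->  {verdict}")
```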


Weak measurement is the key


Kim and colleagues reconditioned these strategies in a way that enabled them to recover the original entanglement post certification. The key to their success was a process called weak measurement.


In quantum mechanics, a measurement is any process that probes a quantum system to obtain information (as numbers) from it, and the theory models measurements in two ways: projective or “strong” measurements and non-projective or “weak” measurements. Conventional EC strategies employ projective measurements, which extract information by transforming each subsystem into an independent state such that the joint state of the composite system becomes separable – in other words, it completely loses its entanglement. Weak measurements, in contrast, don’t disturb the subsystems so sharply, so the subsystems remain entangled – albeit at the cost of lesser information extraction compared to projective measurements.


The team introduced a control parameter for the strength of measurement on each subsystem and re-derived the certifying inequality to incorporate these parameters. They then iteratively prepared their qubit system in the state to be certified and measured a fixed sub-unit value (weak measurement) of the parameters. After all the iterations, they collected statistics to check for the violation of the certification inequality. Once a violation occurred, meaning that the state is entangled, they implemented further suitable weak measurements of the same strength on the same subsystems to recover the initial entangled state with some probability R (for “reversibility”).
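
The trade-off at the heart of the scheme – weaker measurements are more recoverable but extract less information – can be seen in a minimal single-qubit model. The operators below are a textbook parametrization of a partial measurement and its probabilistic reversal, not necessarily the exact ones used in the study:

```python
import numpy as np

# A strength-tunable (partial) measurement on one qubit and its probabilistic
# reversal -- a textbook parametrization, not necessarily the operators used
# in the KAIST scheme. s = 0 means no measurement, s = 1 is fully projective.
def partial_measurement(s):
    M0 = np.diag([1.0, np.sqrt(1 - s)])   # "null" outcome: only partial collapse
    M1 = np.diag([0.0, np.sqrt(s)])       # "click" outcome: collapse onto |1>
    return M0, M1

def reversal(s):
    # Complementary partial measurement that undoes M0 when it succeeds
    return np.diag([np.sqrt(1 - s), 1.0])

psi = np.array([1.0, 1.0]) / np.sqrt(2)   # qubit prepared in |+>

for s in (0.2, 0.6, 0.95):
    M0, _ = partial_measurement(s)
    after = M0 @ psi
    p_null = np.vdot(after, after).real             # probability of the gentle outcome
    recovered = reversal(s) @ after
    p_success = np.vdot(recovered, recovered).real  # joint probability of recovery
    recovered = recovered / np.linalg.norm(recovered)
    fidelity = abs(np.vdot(psi, recovered)) ** 2
    print(f"s = {s:.2f}: P(gentle outcome) = {p_null:.3f}, "
          f"recovery fidelity = {fidelity:.3f}, recovery probability = {p_success:.3f}")
```

In this toy model the state comes back perfectly whenever the reversal succeeds, but the success probability falls as the measurement gets stronger – the same trend as the reversibility R reported in the experiment.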


Lifting the trust assumption


The physicists also demonstrated this theoretical proposal on a photonic setup called a Sagnac interferometer. For each of the three strategies, they used a typical Sagnac setup for a bi-partite system that encodes entanglement into the polarization state of two photons. This involves introducing certain linear optical devices to control the measurement strength and settings for the certification and further retrieval of the initial state.


As predicted, they found that as the measurement strength increases, the reversibility R goes down and the degree of entanglement decreases, while the certification level (a measure of how much the certifying inequality is violated) for each case increases. This implies the existence of a measurement strength “sweet spot” such that the certification levels remain somewhat high without too much loss of entanglement, and hence reversibility.


In an ideal experiment, the entanglement source would be trusted to prepare the same state in every iteration, and destroying entanglement in order to certify it would be benign. But a realistic source may never output a perfectly entangled state every time, making it vital to filter out useful entanglement soon after it is prepared. The KAIST team demonstrated this by applying their scheme to a noisy source that produces a multi-qubit mixture of an entangled and a separable state as a function of time. By employing weak measurements at different time steps and checking the value of the witness, the team certified and recovered the entanglement from the mixture, lifting the trust assumption, and using it further for a Bell nonlocality experiment.


The post Weak measurement lets quantum physicists have their cake and eat it appeared first on Physics World.


]]>
https://hadamard.com/c/weak-measurement-lets-quantum-physicists-have-their-cake-and-eat-it/feed/ 0 119
Why the Institute of Physics launched a campaign to get the media to ‘Bin the boffin’ https://hadamard.com/c/why-the-institute-of-physics-launched-a-campaign-to-get-the-media-to-bin-the-boffin/ https://hadamard.com/c/why-the-institute-of-physics-launched-a-campaign-to-get-the-media-to-bin-the-boffin/#respond Mon, 20 Nov 2023 11:00:50 +0000 https://physicsworld.com/?p=111123 Continue reading Why the Institute of Physics launched a campaign to get the media to ‘Bin the boffin’]]> It is not often that a campaign launched by a scientific learned society finds itself featured on the front page of the Daily Star. But that is what happened, not once, but twice over the past year when the paper responded to a call from the Institute of Physics (IOP), which publishes Physics World, for the media in the UK and Ireland to “Bin the boffin”. The campaign aims to persuade journalists to stop using the outdated slang term “boffin” as a catch-all to describe any scientist, technician, researcher or expert who happens to be the subject of their coverage.


As boffin is mostly a term used by the red-top tabloids, we initially directed our call towards them (although it is worth mentioning there are other offenders – the Economist is also rather fond of the term, for example). The impact of the campaign was greater than we could have hoped. There was a quick win when the editor of the Daily Mirror clarified that the word should not be used by its reporters, while the Daily Star ran a defence of the word on its front page and continues to use it. Only the Sun refused to engage.


We believe that boffin is a lousy way to talk about scientists. The term has negative impacts – it is poorly understood, strongly associated with the male gender and is confusing. When we surveyed our members last year, they told us that the term was unhelpful and inaccurate, with younger members stating it actively puts them off science. To be clear, the IOP isn’t seeking to ban the word. If a pub quiz team, say, wants to be called “Brilliant boffins” that’s fine and if scientists don’t mind the word, then we would consider that a matter of personal taste.


Two Daily Star front pages with large headlines including the word boffin


But when it comes to reporting important discoveries, trends in science, breakthroughs and new techniques, we believe the media should use something more accurate, such as “scientist”. It’s worth recalling that in the early 19th century the term “scientist” was considered by the British press as an ugly Americanism, with a preference instead for “man of science”, which goes to show that times and language can and do change – often for the better.


Our good-humoured call to bin the boffin is intended to start a conversation about how the media portrays scientists and science. This has already had an impact, with follow-up interviews on national radio where we could make broader points about how media stereotypes shape perceptions. By raising concerns about that one word, we also had the chance to talk, for example, about the way stock photos are used, with their tendency to focus on single, heroic scientists, rather than teams, which further cements the tendency to assume that physicists are likely to be white men. The campaign is also designed to draw attention to the IOP’s new guidelines on science reporting, which otherwise might have gone under the radar.


Hopefully, this effort can help increase the diversity of the scientific population. We all know that physics has a representation and equity problem. It is why the IOP launched its Limit Less campaign, of which the boffin-binning initiative is a part, to break down the prejudice and stereotypes that leave too many young people with misconceptions about what physics is. The campaign works with schools, educators, parents, opinion-formers and politicians to create an environment where the message every young person hears is that physics is for you; people like you study physics and you can do well and thrive.


The IOP represents, supports and celebrates members as well as fosters their career development and offers networking opportunities, awards and lectures. We get involved in these issues because we have a duty of care for the future of our community of physicists. It is why we argue for greater government resources for R&D as well as for the UK to rejoin the Europe Union’s Horizon Europe scheme and why we continue to raise the issue of the shortage of physics teachers in our schools. But research funding, extra teachers, generous grant funding packages and national plans can only get you so far. We also need a pipeline of bright, engaged young people from all backgrounds and life experiences, who see physics as the right choice for them and choose science as a way to make a mark on the world.


When young people are deterred from studying physics, which still happens far too often, they are missing out on the many benefits it brings. They are denied the opportunity to explore how their world works and to contribute to shaping the future as informed citizens, as well as losing the opportunity to play a role in the technological and scientific challenges of our age.


That’s why we will continue to campaign and why we will ask, politely but firmly, for the media to “Bin the boffin”.


The post Why the Institute of Physics launched a campaign to get the media to ‘Bin the boffin’ appeared first on Physics World.


]]>
https://hadamard.com/c/why-the-institute-of-physics-launched-a-campaign-to-get-the-media-to-bin-the-boffin/feed/ 0 120
New chip architecture offers hope for scaling up superconducting qubit arrays https://hadamard.com/c/new-chip-architecture-offers-hope-for-scaling-up-superconducting-qubit-arrays/ https://hadamard.com/c/new-chip-architecture-offers-hope-for-scaling-up-superconducting-qubit-arrays/#respond Fri, 17 Nov 2023 09:28:50 +0000 https://physicsworld.com/?p=111341 Continue reading New chip architecture offers hope for scaling up superconducting qubit arrays]]> Scientists in the US have introduced an ingenious new quantum chip architecture that significantly reduces disturbances caused by the signals used to control superconducting quantum bit (qubit) circuits. Led by Chuan Hong Liu and Robert McDermott of the University of Wisconsin, the team showed that the new multichip module (MCM) reduces gate errors by nearly a factor of 10 compared to earlier designs that used the same control system, making it a viable competitor to standard technologies.


Of the many physical systems researchers are exploring as potential  “building blocks” for a scalable quantum computer, the superconducting qubit stands out due to its high coherence time (a measure of how long it remains in a quantum state) and fidelity (a measure of how error-free its operations are). But as powerful as superconducting quantum computing can be, unlocking its full potential will require more than 1 million physical qubits. This presents a challenge, as the superconducting qubit system demands bulky cryogenic coolers and sophisticated microwave control apparatus to operate.


One way of simplifying this control apparatus would be to control the qubits using the smallest units of magnetic field – flux quanta – instead of microwaves. Quantum gates based on this single flux quantum (SFQ) digital logic technology, as it is known, use a sequence of quantized flux pulses with an inter-pulse timing precisely calibrated to the qubit’s oscillation period. This method is energy efficient, compact and capable of high-speed operations, making it an ideal candidate for integration into multiqubit circuits.
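
The picture here is that each SFQ pulse nudges the qubit through a small, fixed rotation, and delivering one pulse per qubit oscillation period lets those nudges add up coherently into a full gate. The per-pulse tip angle and qubit frequency below are illustrative assumptions, not parameters of the Wisconsin device:

```python
import math

# Each SFQ pulse, arriving in step with the qubit's oscillation, tips the
# qubit by a small fixed angle. Both numbers below are illustrative
# assumptions, not parameters of the device reported here.
qubit_frequency = 5e9      # 5 GHz qubit
tip_per_pulse = 0.02       # rotation per SFQ pulse, in radians (assumed)

pulses_for_pi = math.ceil(math.pi / tip_per_pulse)   # pulses needed for a pi (X) rotation
gate_time = pulses_for_pi / qubit_frequency          # one pulse per qubit period

print(f"Pulses needed for a pi rotation: {pulses_for_pi}")
print(f"Approximate gate time: {gate_time * 1e9:.1f} ns")
```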


A poisonous problem


The problem is that the SFQ circuit must be placed close to the qubits, which inevitably leads to a phenomenon called quasiparticle poisoning during pulse generation. This quasiparticle poisoning induces undesired relaxations, excitations and disruptions in the superconducting circuit, diminishing the qubit’s lifespan.


To circumvent this challenge, Liu and colleagues adopted the MCM architecture. In this setup, the SFQ driver and the qubit circuits reside on separate chips. These chips are stacked on top of each other with a 6.4 micrometre gap in between and are bonded together using interconnections known as In-bumps. The physical separation between the two chips offers several advantages. It mainly acts as a barrier, preventing quasiparticles from dissipating directly from the SFQ driver to the qubit. Additionally, it prevents another source of disturbances – phonons, which are atomic or molecular vibrations – from travelling through the material, as the In-bump bonds offer a sort of resistance to their propagation. Thanks to this resistance, these vibrations are effectively scattered and prevented from reaching the qubit chip.


Order of magnitude improvement


In initial trials of SFQ digital logic using an on-chip design, the average qubit gate error was 9.1%. Thanks to the MCM, Liu and McDermott’s team lowered this to 1.2% – nearly an order of magnitude improvement.


As a future objective, the Wisconsin researchers and their colleagues at Syracuse University, the National Institute of Standards and Technology, the University of Colorado and Lawrence Livermore National Laboratory aim to further reduce the sources of quasiparticle poisoning. By experimenting with other suitable designs and further optimizing the SFQ pulse trains, the team say it may be possible to reduce gate errors to as low as 0.1% or even 0.01%, making SFQ a promising path toward achieving scalability in superconducting qubits and unlocking the exponential computing power of fault-tolerant quantum computers.


The research is published in PRX Quantum.


The post New chip architecture offers hope for scaling up superconducting qubit arrays appeared first on Physics World.


]]>
https://hadamard.com/c/new-chip-architecture-offers-hope-for-scaling-up-superconducting-qubit-arrays/feed/ 0 127
New superconducting nanowire single-photon detector has 400,000 pixels https://hadamard.com/c/new-superconducting-nanowire-single-photon-detector-has-400000-pixels/ https://hadamard.com/c/new-superconducting-nanowire-single-photon-detector-has-400000-pixels/#respond Thu, 16 Nov 2023 08:30:43 +0000 https://physicsworld.com/?p=111329 Continue reading New superconducting nanowire single-photon detector has 400,000 pixels]]> Single-photon detector


The highest resolution to date in a superconducting nanowire single-photon detector (SNSPD) camera has been claimed by researchers in the US. Designed by a team at the National Institute of Standards and Technology (NIST) and NASA’s Jet Propulsion Laboratory, the camera offers a pixel count some 400 times higher than other state-of-the-art designs, without sacrificing any of their advantages.


First demonstrated two decades ago, SNSPDs have transformed our ability to capture images at extremely low light levels. They feature square-grid arrays of intersecting nanowires cooled to just above absolute zero. Each wire carries an electrical current at just below the critical current at which superconductivity is destroyed.


When a nanowire is struck by a single photon, the heat it absorbs will temporarily shut down the superconductivity until the energy has dissipated. This causes the current to be shunted to small resistive heating elements positioned at the nearest intersections between perpendicular nanowires – each connected to their own separate readout lines. The signals from these readouts act as individual pixels, indicating each photon’s location of detection.


“SNSPDs have some very appealing characteristics,” explains team leader Bakhrom Oripov at NIST. “They work for any [photon] wavelength up to 29 mm (not true for many other silicon technologies) and have demonstrated detection efficiencies of 98% at 1550 nm. They also have very low uncertainties in photon arrival times (timing jitter) and have extremely low false detection rates (dark counts).”


Resolution limitations


Despite these advantages, the need for independent readout wires for each pixel has made it difficult to scale up SNSPDs to create larger detectors. So far, this has meant that even the highest-resolution devices have little more than 1000 pixels.


Oripov’s team took a different approach to detector design and this allowed them to detect photons using readout lines arranged parallel to the nanowires in each row and column.


“Instead of using direct electrical signal readout from detectors, we first transduce that electrical signal into heat in the readout line (generated by a resistive heating element) and use it to trigger counter-propagating electrical pulses in the readout line,” Oripov explains.


By comparing the arrival times of these pulses at each end of a readout line, the camera can then pinpoint precisely where along the nanowire the photon was absorbed. In this way, a pixel is generated at the point where the photon absorption site detected in one row intersects with a detection in a perpendicular column.
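
This position reconstruction is essentially a time-difference-of-arrival measurement. The sketch below shows the arithmetic for a single readout line; the line length, propagation speed and timestamps are made-up values used only for illustration.

```python
# Minimal sketch: locating a photon absorption site along one readout line by
# comparing the arrival times of the two counter-propagating pulses.
# All numbers here are illustrative assumptions, not device parameters.

def absorption_position(t_left, t_right, line_length, v_pulse):
    """Return the distance (in metres) from the left end of the readout line.

    t_left, t_right : pulse arrival times at the left/right ends (seconds)
    line_length     : total length of the readout line (metres)
    v_pulse         : pulse propagation speed along the line (metres/second)
    """
    # A pulse created at position x reaches the left end after x / v and the
    # right end after (L - x) / v, so the arrival-time difference gives x.
    return 0.5 * (line_length + v_pulse * (t_left - t_right))

# Example with made-up values: a 2 mm line, pulses travelling at 2e7 m/s.
L, v = 2e-3, 2e7
print(absorption_position(t_left=60e-12, t_right=40e-12, line_length=L, v_pulse=v))
```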


Fewer readout lines


In contrast with previous designs – where a total of N² readout lines were required to monitor an array of N×N nanowires – this new design can build up single-photon images with just 2N readout lines.
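
The wiring saving grows quickly with array size, as the short calculation below shows; it simply evaluates the N² versus 2N counts quoted above for a few illustrative values of N.

```python
# Readout lines needed for an N x N nanowire array:
# one line per pixel (old approach) versus one line per row and column (new approach).
for n in (32, 100, 316, 632):            # 632 x 632 ~ 400,000 pixels
    per_pixel = n * n                    # N^2 readout lines
    row_column = 2 * n                   # 2N readout lines
    print(f"N={n:4d}: {per_pixel:7d} lines vs {row_column:5d} lines "
          f"(x{per_pixel / row_column:.0f} fewer)")
```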


As Oripov describes, this improvement will make it vastly easier for the team to improve resolution in their design. “We showed we can indeed scale to large number of pixels without sacrificing other properties such as single photon sensitivity, readout jitter and dark count,” he says.


Their device achieved a pixel count of 400,000 – some 400 times higher than existing state-of-the-art designs. But with further improvements, they are confident that this number could be increased. If achieved, this would pave the way for a new generation of large-scale SNSPDs, suitable for single-photon imaging across a broad band of the electromagnetic spectrum.


Already, Oripov envisages a diverse range of possibilities for the new technology: from improved astronomy techniques for investigating dark matter and mapping the early universe, to new opportunities for quantum communications and medical imaging.


“It seems like with this result, we got the attention of a few astrophysicists and biomedical imaging people, all interested in collaborating and making better imaging tools,” he says. “That’s certainly an exciting moment both for our team and our colleagues in the field of SNSPD research in general.”


The new detector is described in Nature.


The post New superconducting nanowire single-photon detector has 400,000 pixels appeared first on Physics World.


]]>
https://hadamard.com/c/new-superconducting-nanowire-single-photon-detector-has-400000-pixels/feed/ 0 130
Adaptive optics pioneers win Rank Prize for retinal imaging breakthroughs https://hadamard.com/c/adaptive-optics-pioneers-win-rank-prize-for-retinal-imaging-breakthroughs/ https://hadamard.com/c/adaptive-optics-pioneers-win-rank-prize-for-retinal-imaging-breakthroughs/#respond Tue, 14 Nov 2023 13:00:12 +0000 https://physicsworld.com/?p=111247 Continue reading Adaptive optics pioneers win Rank Prize for retinal imaging breakthroughs]]> The Rank Prize winners


Four scientists who pioneered the development of adaptive optics (AO) technologies for imaging the human retina have been awarded the 2024 Rank Prize for Optoelectronics. The winners – Junzhong Liang, Donald Miller, Austin Roorda and David Williams – invented instruments that use AO to capture high-resolution images of the living retina and provide new insight into the structure and function of the human eye.


AO was originally developed for use in astronomy, to eliminate atmosphere-induced blur in images from ground-based telescopes. It works by measuring distortions in a reflected wavefront using a wavefront sensor, and then compensating for these distortions with a wavefront corrector, which is often a deformable mirror.
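
In control terms, an AO system is a feedback loop: the wavefront sensor measures the residual distortion and the deformable mirror is nudged to cancel it. The toy loop below illustrates the idea with a simple integrator controller; the number of corrected modes, the loop gain and the noise level are arbitrary choices, not values from any real instrument.

```python
# Toy adaptive-optics loop: an integrator controller drives a "deformable mirror"
# to cancel a static aberration, using noisy wavefront-sensor measurements.
# All parameters (modes, gain, noise, iterations) are illustrative assumptions.
import random

random.seed(1)
aberration = [0.8, -0.5, 0.3, -0.2]      # true wavefront error, arbitrary units
mirror     = [0.0] * len(aberration)     # current mirror correction
gain       = 0.5                         # integrator loop gain

for step in range(10):
    # Sensor sees the residual (aberration minus correction) plus measurement noise.
    residual = [a - m + random.gauss(0, 0.01) for a, m in zip(aberration, mirror)]
    # Integrator update: nudge the mirror towards cancelling the residual.
    mirror = [m + gain * r for m, r in zip(mirror, residual)]
    rms = (sum(r * r for r in residual) / len(residual)) ** 0.5
    print(f"iteration {step}: residual RMS = {rms:.3f}")
```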


In 1997, Liang, Williams and Miller demonstrated that AO can also be used to correct for distortions caused by imperfect optics within the human eye. Using AO, they created a retinal imaging camera with unprecedented resolution, enabling clear imaging of individual photoreceptor cells in the living human retina. Two years later, Roorda and Williams used this instrument to produce the first-ever images showing the distribution of the three types of cones in the human retina.


According to Donal Bradley, chair of the Rank Prize Optoelectronics Committee, the prize recognizes the winners’ “seminal contribution to imaging within the eye that opens new opportunities to understand this complex optical instrument and to improve eyesight through precise interventions”. Tami Freeman spoke to two of the winners to find out more.


Since its invention, how has AO impacted the field of eye imaging?


Donald Miller AO is the only technology that allows the visualization of individual retinal cells in a living eye. And because disease and pathology start at this cellular level, that’s the level we ultimately want clinicians to operate at, for earlier diagnosis and more effective treatments.


As one example from my own lab, we’ve recently been looking at the impact of glaucoma, one of the leading causes of irreversible blindness in the world, on retinal ganglion cells – the primary cell type that dies in this disease and which line the top of the retina. While effective treatments exist, the disease is unfortunately hard to diagnose early until significant damage has occurred. With AO, we can now, for the first time, monitor individual retinal ganglion cells and track them over time in these patients.


Using AO combined with optical coherence tomography (AO-OCT), we have found that, even in eyes under treatment, we see subclinical loss of cells. That’s important because clinicians can now use these cellular-level measurements to better establish whether or not their treatment is working. It also offers considerable potential for testing the efficacy and safety of new neuroprotective and regenerative strategies. The visualization of retinal ganglion cells in human subjects has only become possible within the last few years – we are entering a really exciting time.


Austin Roorda As treatments become available for the major blinding eye diseases, like diabetes, glaucoma and macular degeneration, we can now use AO to assess how effective they are. But there are other inherited retinal diseases due to gene mutations for which very little is known. In those rare diseases, previously the only way to see what was happening on a cellular scale was to wait for a donor eye and look at it under a microscope. AO has opened up the ability to examine the retina on a microscopic scale in these patients. Treatments such as gene therapy are on the horizon that could potentially cure or halt these inherited diseases. AO is poised to play a key role in that process – to understand how the mutation affects the retina, assess the state of the retina, predict the prognosis if the patient undergoes gene therapy, and then measure the effectiveness of that therapy.


How has AO technology progressed over the last 25 years?


AR AO was originally constrained by the technology that was available, which was largely developed for the field of astronomy. So the deformable mirror was big and wasn’t suited to the eye. Over the years, when companies started recognising the potential of AO in other fields, including ophthalmoscopy, they started to build wavefront sensing devices and wavefront correctors (the deformable mirror) that were a lot better suited to applications in the human eye.


DM When we first developed the AO system, we made a lot of guesses: what type of wavefront correction to use, what wavefront sensor, the loop speed and so on. In the next five to 10 years there were a lot of improvements in our understanding of the spatial properties and temporal dynamics of ocular aberrations. These then defined the AO components: how many actuators you need in your wavefront corrector, what the stroke [actuator displacement] should be, how many sampling points you need across the pupil, and how fast the AO system should go. Those have all been optimized over the years.


The first AO system for the eye


For example, the wavefront corrector we used in 1997 had 37 actuators that push and pull on the back surface of the mirror to warp its shape, and it would give four microns of stroke. The ones used today have close to 100 actuators and give an order of magnitude more stroke, which is important because the eyes have severe aberrations; that’s made a big difference.


AR Now, when you use AO, you push a button and it runs automatically at anywhere from tens to hundreds of hertz. Before, we had to take a picture, a map of the eye’s aberrations, and scrutinize it to make sure there weren’t any errors in the initial image analysis. Then you would push the next button to apply that shape to the mirror. So the user was an integral part of the closed-loop AO system. It was fun, but it was slow.


Initially, Don, David and Junzhong built a standard flood-illumination camera that would look at the retina through an AO system to reveal the microscopic structure. Later, I incorporated AO into a scanning system to create an AO scanning laser ophthalmoscope (AOSLO) that can record video of the retina and perform depth sectioning. That’s an entirely new AO imaging platform. Other researchers have incorporated a type of phase contrast imaging that can visualize otherwise transparent cells in the retina, and in David’s group they are performing fluorescence imaging in animal eyes.


What’s your current main area of research?


AR If there was a theme for what I’ve been doing for the last 15 years or so, it’s structure and function. It turns out that our AOSLO imager is also the world’s best eye tracker. You can track eye motion very quickly and accurately because you can see the movement of single cells in the back of the eye. We took this a step further, using the scanning laser system not only to image the retina, but to control the placement of images onto the retina on the scale of a single cone.


Austin Roorda working on the AOSLO


We’ve been measuring functional properties in living humans. If you were in the device, I could deliver flashes of light into individual cones and ask whether you could see them or what colour you see. Early on, we mapped the cone mosaic, that was one of the big AO-enabled discoveries. Now we can take that cone mosaic and start asking questions about basic retinal circuits or the fundamental properties of human colour vision. We’re doing the same in eye disease. If we look at an array of cells in a patient and it doesn’t look normal, we’re interested in the functional consequences – not just seeing the structure of that diseased retina but asking about the visual outcomes.


DM We’re also focused on structure and function, but using AO-OCT. The big advantage of OCT is its axial resolution, which lets you section out whatever depth in the retina layer you want to visualize. Cones are very bright and high in contrast, but other cells tend to be much harder to image as they reflect a lot less light back. We’ve made quite a bit of headway using AO-OCT to image these other neurons in the retina at different depths. It was a big step to be able to image retinal ganglion cells, as they are highly transparent and have very low contrast.


We’ve also been using AO-OCT to look at function within photoreceptors. In 2000, Austin and David had developed their pioneering AO retinal densitometry method for cone classification. Twenty years later, we can use the phase information provided by AO-OCT to measure subtle changes in the elongation of these photoreceptor cells when stimulated by different colours of light. That turned out to be a much more accurate and far less time-consuming way to do cone classification and is a good example of the evolution of AO imaging technology.


How do you see the field of AO evolving in the future?


AR In my lab, we focus a lot on subjective measures of function, such as eye movements, acuity and colour vision. But I would envision that as AO techniques evolve, we’ll be able to measure functional properties of most cell classes in the retina. Right now, Don has generated beautiful images of ganglion cells using AO-OCT. These are the last cells before the signals from the retina reach the brain, so it’s a class of neurons whose function we’re very interested in. Using phase methods, or methods we can’t even conceive of right now, we may be able to measure the functional properties of those and other neurons in the retina.


David, Don and I are immersed in basic research, but there are a lot of other people thinking about how to get these systems into the clinic. AO is not easy and it’s not cheap, it’s a complicated technology so the path to the clinic is not easy. There are a few companies now that will sell AO imaging devices, but they’re not used routinely by any stretch.


DM The field of AO waxes and wanes between trying to improve AO performance versus making AO more accessible and commercially viable. In our labs, we’re trying to achieve the very best performance, correcting aberrations and getting sharper images for research or clinical purposes. But there’s a whole other side pushing this technology to make it more compact, cheaper and more automated. The real potential is marrying AO with SLO and OCT for commercial use. I think that it’s just a matter of time.



  • Founded in 1972 by the British industrialist and philanthropist Lord J Arthur Rank, the Rank Prize is awarded biennially in the fields of nutrition and optoelectronics. The Prize will be awarded formally on 1 July 2024.


The post Adaptive optics pioneers win Rank Prize for retinal imaging breakthroughs appeared first on Physics World.


]]>
https://hadamard.com/c/adaptive-optics-pioneers-win-rank-prize-for-retinal-imaging-breakthroughs/feed/ 0 136
What the movie Oppenheimer can teach today’s politicians about scientific advice https://hadamard.com/c/what-the-movie-oppenheimer-can-teach-todays-politicians-about-scientific-advice/ https://hadamard.com/c/what-the-movie-oppenheimer-can-teach-todays-politicians-about-scientific-advice/#respond Mon, 13 Nov 2023 14:31:55 +0000 https://physicsworld.com/?p=110831 Continue reading What the movie Oppenheimer can teach today’s politicians about scientific advice]]> One of the scariest moments in Robert Oppenheimer’s career was not shown in Oppenheimer, the recent blockbuster movie. That moment occurred during the contentious hearing on his security clearance, which concerned Oppenheimer’s role on a committee to advise the US government on nuclear weapons. Ostensibly about his loyalty to America, the hearing also revealed deeper concerns about his left-wing sympathies and opposition to an early project to build a hydrogen bomb.





Oppenheimer never regretted his leadership of the Manhattan Project, which built the atomic bombs that were dropped on Hiroshima and Nagasaki in 1945. But he feared that without international control of atomic weapons, developing the (much bigger) hydrogen bomb would trigger an arms race. Political enthusiasts of the hydrogen bomb then revoked his clearance, prompting the hearing, in which the experienced and unscrupulous attorney Roger Robb was appointed to interrogate Oppenheimer.


All this is skilfully depicted in the movie. Near the end of the hearing, however, is a moment that is not included and whose terrifying implications still reverberate. It’s when Robb, in an apparent non sequitur, suddenly quizzes Oppenheimer about John Ericsson, the Swedish-American naval architect who designed ships for the US government during the American Civil War almost a century earlier. Robb asks Oppenheimer whether the fact that Ericsson had designed and built the Monitor, the first ironclad battleship, qualified him to plan naval strategy.


Flummoxed by the bizarre twist, Oppenheimer says “No”. Robb then springs his trap. “Doctor” – superficially appearing to show respect for Oppenheimer’s credentials – “do you think now that perhaps you went beyond the scope of your proper function as a scientist in undertaking to counsel in matters of military strategy and tactics?” Robb was sly, smoothly but falsely equating Ericsson’s qualifications to “plan” military strategy with Oppenheimer’s to “counsel” it, implying that both were equally invalid.


At that moment, Robb was not out just to silence Oppenheimer as a government adviser. That wasn’t necessary; Oppenheimer’s consultancy contract could have simply been cancelled or left to expire – which, ironically, it did at the end of June 1954, one day after his security clearance was stripped at the end of the trial. Robb was after bigger game, which was to prevent any scientist from advising politicians on government policy.


Robb was effectively saying that Ericsson knew how to make boats and Oppenheimer knew how to build bombs – but only politicians and military leaders know how to use them. Keeping the two separate is, Robb believed, the right way for the government to run things.





Robb had rewritten history, for Ericsson had in fact both built boats and advised how to use them. Ericsson had advised the Secretary of the Navy on strategies for using “little” and “big” ironclads. He wrote of strategies for defending cities on the Atlantic coast and for future wars. He wrote to President Abraham Lincoln and testified before Congress. Sometimes his advice was taken and sometimes not, but the Union benefitted from it.



Though the hearing was a kangaroo court, Oppenheimer could easily have challenged Robb’s argument. In fact, he had started to craft a response on the hearing’s first day, mentioning that an obstacle to scientific advisers was that politicians tended to regard them as academics who were “pleading a special interest”. As Oppenheimer added: “We did plead a special interest, but we believed it to be in the national interest, too.”


But he did not get to develop this question of how a nation’s interests can profit from scientists’ special interests because  his interrogation was quickly switched to his associations, honesty and loyalty. Had Oppenheimer done so, he would have outlined a plan to have scientifically sensitive politicians and politically sensitive scientists mutually evaluate potential courses of action. There’s no magic trick to make this happen, but arguing why it’s necessary is a start.


Oppenheimer thought that it would happen in America; Robb was out to make sure it wouldn’t. He was afraid that scientific advisers would attempt to intimidate politicians, giving them rules to follow. Scientists develop tools, politicians use them, Robb insisted. Politicians have the right to ignore scientific advice and decide courses of action based solely on their own interests.


The critical point


Today, almost 70 years after that exchange, we need to make the case that Oppenheimer was never able to. Our adversaries of scientific advice charge scientists not with disloyalty but with conspiracy, and are captivated not by big bombs but by fossil-fuel interests. Some not only admit that they are ignoring scientific advice but campaign on it. We have no magic tricks either, only elections. But without such advice, politicians are blindfolding themselves, discharging weapons without any clear idea of what they are shooting or hitting.


Robb’s dangerous vision is that claimed by many politicians today – that they have the right to ignore things such as what climatologists have to say about global warming or what epidemiologists have to say about pandemics. It’s understandable why the makers of Oppenheimer did not include that moment, for that movie is a drama. Ours is a horror show.


The post What the movie Oppenheimer can teach today’s politicians about scientific advice appeared first on Physics World.


]]>
https://hadamard.com/c/what-the-movie-oppenheimer-can-teach-todays-politicians-about-scientific-advice/feed/ 0 141
Neutral-atom quantum computers are having a moment https://hadamard.com/c/neutral-atom-quantum-computers-are-having-a-moment/ https://hadamard.com/c/neutral-atom-quantum-computers-are-having-a-moment/#respond Thu, 09 Nov 2023 14:26:45 +0000 https://physicsworld.com/?p=111184 Continue reading Neutral-atom quantum computers are having a moment]]> In the race for the quantum computing platform of the future, neutral atoms have been a bit of an underdog. While quantum bits (qubits) based on neutral atoms have several attractive characteristics, including the ease of scaling up qubit numbers and performing operations on them in parallel, most attention has focused on rival platforms. Many of the largest machines are built with superconducting qubits, including those developed at IBM, Google, Amazon, and Microsoft. Other companies have opted for ions, like Honeywell and IonQ, or photons, like Xanadu.


In the past few weeks, though, several eye-catching developments have pushed neutral atoms towards the front of the pack. One of them came from a start-up called Atom Computing, which announced in late October that it will soon have a 1000-qubit neutral-atom machine ready for customers – the first commercial quantum device to pass this milestone. The others came from three teams of researchers who published separate studies in Nature describing neutral-atom platforms with low noise, new error mitigation capacities and strong potential for scaling up to even larger numbers of qubits.


For any qubit platform, the biggest barriers to robust quantum operations are noise and the errors it causes. “Error correction is really the frontier of quantum computing,” says Jeff Thompson, a physicist at Princeton University, US who led one of the three studies together with Shruti Puri of Yale University, US. “It’s the thing that’s standing in between us and actually doing useful calculations.”


The reason error correction is so important is that it makes computations possible even if the underlying hardware is prone to noise. Classical computers use a simple error correction strategy called a repetition code: store the same information multiple times so that if there’s an error in one bit, the “majority vote” of the remaining bits will still point to the correct value. Quantum error correction algorithms are essentially more complex versions of this, but before a platform can benefit from them, their hardware must meet some minimal fidelity requirements. For traditional quantum algorithms, the rule of thumb is that the error rate for the minimum unit of quantum computation – a quantum gate – should be below 1%.
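
The classical strategy mentioned above can be written down in a couple of lines. The sketch below stores one logical bit in three physical copies and decodes by majority vote; the bit values are just an example.

```python
# Classical repetition code: store one logical bit three times and decode by
# majority vote, so any single flipped copy is outvoted by the other two.
def majority_vote(bits):
    return int(sum(bits) > len(bits) / 2)

stored = [1, 1, 1]               # logical "1" stored three times
received = [1, 0, 1]             # one copy flipped by noise
print(majority_vote(received))   # -> 1, the error is corrected
```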


Bringing down the noise


Researchers led by Mikhail Lukin of Harvard University, US, are now reporting that their neutral-atom quantum computer has met that threshold, achieving an error rate of 0.5%. They reached this milestone by implementing two-qubit gates in a way pioneered by teams in Germany and France, and their machine, which they developed with colleagues at the neighbouring Massachusetts Institute of Technology (MIT) and QuEra Computing, works as follows.


First, a vapour of rubidium atoms is cooled to just above absolute zero. Then, individual atoms are captured and held by tightly focused laser beams in a technique known as optical tweezing. Each atom represents a single qubit, and hundreds are arranged in a two-dimensional array. The quantum information in these qubits – a zero or one or a quantum superposition of the two – is stored in two different energy levels of the rubidium atoms.


To perform a two-qubit gate, two atoms are brought near each other and simultaneously illuminated by a laser. The illumination promotes one of the atom’s electrons to a high energy level known as a Rydberg state. Once in this state, atoms easily interact with their near neighbours, making the gate operation possible.


To improve the fidelity of the operation, the team used a recently developed optimized pulse sequence for exciting the two atoms to the Rydberg state and bringing them back down. This pulse sequence is faster than previous versions, giving the atoms less chance to decay into the wrong state, which would break the calculation. Combining this with other technical improvements allowed the team to reach 99.5% fidelity for two-qubit gates.


Although other platforms have achieved comparable fidelities, neutral-atom quantum computers can do more computations in parallel. In their experiment, Lukin and his team applied their two-qubit gate to 60 qubits at once simply by illuminating them with the same laser pulse. “This makes it very, very special,” Lukin says, “because we can have high fidelities and we can do it in parallel with just a single global control. No other platform can actually do that.”


Erasing errors


An artist's drawing of five spheres in a line. The spheres represent atoms; four of the atoms are yellow, while one of them glows pink


While Lukin’s team optimized their experiment to meet the fidelity threshold for applying error correction schemes, Thompson and Puri, together with colleagues at the University of Strasbourg, France, found a way to convert certain kinds of errors to erasures, removing them from the system altogether. This makes these errors much easier to correct, lowering the threshold for error-correction schemes to work.


Thompson and Puri’s setup is similar to that of the Harvard-MIT team, with individual ultracold atoms held in optical tweezers. The main difference is that they used ytterbium atoms instead of rubidium. Ytterbium has a more complicated energy-level structure than rubidium, which makes it more difficult to work with but also provides more options for encoding quantum states. In this case, the researchers encoded the “zero” and “one” of their qubits in two metastable states, rather than the traditional lowest two energy levels. Although these metastable states have shorter lifetimes, many of the possible error mechanisms would bump the atoms out of these states and into the ground state, where they can be detected.


Being able to delete errors is a big boon. Classically, if more than half the bits in a repetition code have errors, the wrong information will be transmitted. “But with the erasure model, it’s much more powerful because now I know which bits have had an error, so I can exclude them from the majority vote,” Thompson explains. “So all I need is for there to be one good bit left.”
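
Thompson’s counting argument can be illustrated with a small variation on the repetition-code vote: copies flagged as erased are dropped before voting, so a single surviving good copy is enough. This is only a cartoon of the idea, not the team’s actual decoder.

```python
# Majority vote when some copies are flagged as erasures: known-bad copies are
# excluded, so the vote only runs over copies we still trust.
def vote_with_erasures(bits, erased):
    trusted = [b for b, e in zip(bits, erased) if not e]
    if not trusted:
        raise ValueError("all copies erased: no information left")
    return int(sum(trusted) > len(trusted) / 2)

received = [1, 0, 0]                          # two copies corrupted...
erased   = [False, True, True]                # ...but both corruptions were flagged
print(vote_with_erasures(received, erased))   # -> 1, recovered from one good copy
```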


Thanks to their erasure conversion technique, Thompson and colleagues were able to detect about a third of the errors in real time. Though their two-qubit gate fidelity of 98% is less than that of the Harvard-MIT team’s machine, Thompson notes that they used almost 10 000 times less laser power to drive their gate, and increasing the power will boost the performance while also allowing a larger fraction of errors to be detected. The error erasure technique also lowers the threshold for error correction to below 99%; in a scenario where almost all errors are converted to erasures, which Thompson says should be possible, the threshold could be as low as 90%.


Multiplexing error erasure


In a related result, researchers at the California Institute of Technology, US (Caltech) also converted errors to erasures. Their strontium-based neutral atom machine is a more restricted kind of quantum computer known as a quantum simulator: while they can excite atoms up to the Rydberg state and create entangled superpositions between the ground and Rydberg states, their system has only one ground state, which means they cannot store quantum information long-term.


However, they created these entangled superpositions with unprecedented fidelity: 99.9%. They also made a huge superposition consisting of not just two atoms, but 26, and improved the fidelity of doing so by erasing some of the errors. “We basically show that you could meaningfully bring this technique into the realm of the many-body,” says Adam Shaw, a PhD student in Manuel Endres’ group at Caltech.


Together, the three advances show off the capabilities of neutral-atom quantum computers, and the researchers say their ideas can be combined into a machine that works even better than the ones demonstrated thus far. “The fact that all these works came out together, it’s a little bit of a sign that something special is about to come,” Lukin concludes.


The post Neutral-atom quantum computers are having a moment appeared first on Physics World.


]]>
https://hadamard.com/c/neutral-atom-quantum-computers-are-having-a-moment/feed/ 0 147
Physicists trap electrons in a 3D crystal for the first time https://hadamard.com/c/physicists-trap-electrons-in-a-3d-crystal-for-the-first-time/ https://hadamard.com/c/physicists-trap-electrons-in-a-3d-crystal-for-the-first-time/#respond Wed, 08 Nov 2023 16:00:00 +0000 https://news.mit.edu/2023/physicists-trap-electrons-3d-crystal-first-time-1108 Continue reading Physicists trap electrons in a 3D crystal for the first time]]> <p>Electrons move through a conducting material like commuters at the height of Manhattan rush hour. The charged particles may jostle and bump against each other, but for the most part they’re unconcerned with other electrons as they hurtle forward, each with their own energy.</p>nn<p>But when a material’s electrons are trapped together, they can settle into the exact same energy state and start to behave as one. This collective, zombie-like state is what’s known in physics as an electronic “flat band,” and scientists predict that when electrons are in this state they can start to feel the quantum effects of other electrons and act in coordinated, quantum ways. Then, exotic behavior such as superconductivity and unique forms of magnetism may emerge.</p>nn<p>Now, physicists at MIT have successfully trapped electrons in a pure crystal. It is the first time that scientists have achieved an electronic flat band in a three-dimensional material. With some chemical manipulation, the researchers also showed they could transform the crystal into a superconductor — a material that conducts electricity with zero resistance.</p>nn<p>The electrons’ trapped state is possible thanks to the crystal’s atomic geometry. The crystal, which the physicists synthesized, has an arrangement of atoms that resembles the woven patterns in “kagome,” the Japanese art of basket-weaving. In this specific geometry, the researchers found that rather than jumping between atoms, electrons were “caged,” and settled into the same band of energy.</p>n<img alt=”Animation of spinning 3D crystal structure that looks like a star made up of latticed cubes and pyramids.” data-align=”center” data-caption=”The rare electronic state is thanks to a special cubic arrangement of atoms (pictured) that resembles the Japanese art of “kagome.”&lt;br /&gt;n&lt;br /&gt;nImage: Courtesy of the researchers” data-entity-type=”file” data-entity-uuid=”9acd79e3-67c5-4219-bb86-369360f57068″ src=”/sites/default/files/images/inline/3D-trap.gif” />n<p>The researchers say that this flat-band state can be realized with virtually any combination of atoms — as long as they are arranged in this kagome-inspired 3D geometry. The <a href=”https://www.nature.com/articles/s41586-023-06640-1″ target=”_blank”>results</a>, appearing today in <em>Nature</em>, provide a new way for scientists to explore rare electronic states in three-dimensional materials. 
These materials might someday be optimized to enable ultraefficient power lines, supercomputing quantum bits, and faster, smarter electronic devices.</p>nn<p>“Now that we know we can make a flat band from this geometry, we have a big motivation to study other structures that might have other new physics that could be a platform for new technologies,” says study author Joseph Checkelsky, associate professor of physics.</p>nn<p>Checkelsky’s MIT co-authors include graduate students Joshua Wakefield, Mingu Kang, and Paul Neves, and postdoc Dongjin Oh, who are co-lead authors; graduate students Tej Lamichhane and Alan Chen; postdocs Shiang Fang and Frank Zhao; undergraduate Ryan Tigue; associate professor of nuclear science and engineering Mingda Li; and associate professor of physics Riccardo Comin, who collaborated with Checkelsky to direct the study; along with collaborators at multiple other laboratories and institutions.</p>nn<p><strong>Setting a 3D trap</strong></p>nn<p>In recent years, physicists have successfully trapped electrons and confirmed their electronic flat-band state in two-dimensional materials. But scientists have found that electrons that are trapped in two dimensions can easily escape out the third, making flat-band states difficult to maintain in 2D.</p>nn<p>In their new study, Checkelsky, Comin, and their colleagues looked to realize flat bands in 3D materials, such that electrons would be trapped in all three dimensions and any exotic electronic states could be more stably maintained. They had an idea that kagome patterns might play a role.</p>nn<p>In <a href=”https://news.mit.edu/2018/physicists-discover-new-quantum-electronic-material-0319″ target=”_blank”>previous work</a>, the team observed trapped electrons in a two-dimensional lattice of atoms that resembled some kagome designs. When the atoms were arranged in a pattern of interconnected, corner-sharing triangles, electrons were confined within the hexagonal space between triangles, rather than hopping across the lattice. But, like others, the researchers found that the electrons could escape up and out of the lattice, through the third dimension.</p>nn<p>The team wondered: Could a 3D configuration of similar lattices work to box in the electrons? They looked for an answer in databases of material structures and came across a certain geometric configuration of atoms, classified generally as a pyrochlore — a type of mineral with a highly symmetric atomic geometry. The pychlore’s 3D structure of atoms formed a repeating pattern of cubes, the face of each cube resembling a kagome-like lattice. They found that, in theory, this geometry could effectively trap electrons within each cube.</p>nn<p><strong>Rocky landings</strong></p>nn<p>To test this hypothesis, the researchers synthesized a pyrochlore crystal in the lab.</p>nn<p>“It’s not dissimilar to how nature makes crystals,” Checkelsky explains. “We put certain elements together — in this case, calcium and nickel — melt them at very high temperatures, cool them down, and the atoms on their own will arrange into this crystalline, kagome-like configuration.”</p>nn<p>They then looked to measure the energy of individual electrons in the crystal, to see if they indeed fell into the same flat band of energy. To do so, researchers typically carry out photoemission experiments, in which they shine a single photon of light onto a sample, that in turn kicks out a single electron. 
A detector can then precisely measure the energy of that individual electron.</p>nn<p>Scientists have used photoemission to confirm flat-band states in various 2D materials. Because of their physically flat, two-dimensional nature, these materials are relatively straightforward to measure using standard laser light. But for 3D materials, the task is more challenging.</p>nn<p>“For this experiment, you typically require a very flat surface,” Comin explains. “But if you look at the surface of these 3D materials, they are like the Rocky Mountains, with a very corrugated landscape. Experiments on these materials are very challenging, and that is part of the reason no one has demonstrated that they host trapped electrons.”</p>nn<p>The team cleared this hurdle with angle-resolved photoemission spectroscopy (ARPES), an ultrafocused beam of light that is able to target specific locations across an uneven 3D surface and measure the individual electron energies at those locations.</p>nn<p>“It’s like landing a helicopter on very small pads, all across this rocky landscape,” Comin says.</p>nn<p>With ARPES, the team measured the energies of thousands of electrons across a synthesized crystal sample in about half an hour. They found that, overwhelmingly, the electrons in the crystal exhibited the exact same energy, confirming the 3D material’s flat-band state.</p>nn<p>To see whether they could manipulate the coordinated electrons into some exotic electronic state, the researchers synthesized the same crystal geometry, this time with atoms of rhodium and ruthenium instead of nickel. On paper, the researchers calculated that this chemical swap should shift the electrons’ flat band to zero energy — a state that automatically leads to superconductivity.</p>nn<p>And indeed, they found that when they synthesized a new crystal, with a slightly different combination of elements, in the same kagome-like 3D geometry, the crystal’s electrons exhibited a flat band, this time at superconducting states.</p>nn<p>“This presents a new paradigm to think about how to find new and interesting quantum materials,” Comin says. “We showed that, with this special ingredient of this atomic arrangement that can trap electrons, we always find these flat bands. It’s not just a lucky strike. From this point on, the challenge is to optimize to achieve the promise of flat-band materials, potentially to sustain superconductivity at higher temperatures.”</p>

]]>
https://hadamard.com/c/physicists-trap-electrons-in-a-3d-crystal-for-the-first-time/feed/ 0 236
Size matters: the economies of scale, from the very big to the very small https://hadamard.com/c/size-matters-the-economies-of-scale-from-the-very-big-to-the-very-small/ https://hadamard.com/c/size-matters-the-economies-of-scale-from-the-very-big-to-the-very-small/#respond Wed, 08 Nov 2023 10:00:42 +0000 https://physicsworld.com/?p=110807 Continue reading Size matters: the economies of scale, from the very big to the very small]]> One hundred and fifty years ago in the US, the combined power of four strong horses harnessed to a plough was no more than 3 kilowatts – and more than half the entire labour force worked on farms. Today, even the smallest John Deere tractor produces 120 kilowatts and about 1.3% of workers are employed in agriculture. This means that the power rating of farm machinery has been multiplied by 40 and the rural workforce has been divided by 40 in that time – all of which is hardly a coincidence.


In transport, both modern ships and large airliners can generate up to 90 megawatts. That makes them nearly 1000 times more powerful than a typical small car and 100,000 times more powerful than a 19th-century waterwheel. And these are not even the most powerful machines out there: some steam engines that generate electricity now operate at 1000 megawatts. Setting aside the environmental impact, these enormous increases in power have brought a surfeit of food and affordable consumer items to a largely urban society, with increased access to information and mobility.


The Czech–Canadian scientist and policy maker Vaclav Smil analyses this growth of scale and power in his latest book Size: How It Explains the World. Smil is a prolific author who has already published more than 40 books on topics ranging from energy and food production to technical innovation and public policy. In this work, he revisits some of those territories, linking them together with a discussion of size, though it might be more accurate to say that the book is about proportion.


Size is not as trivial as simply a list of things that have become very big over the centuries. The physical limitations on that growth are discussed – the size of oil tankers, for example, is often expressed in deadweight tonnage (dwt), and the sizes of these ships increased steadily from about 20,000 dwt after the Second World War to about 300,000 dwt by the 1970s. There are no engineering limitations to making them more than twice that size and yet that hasn’t happened. Smil points out that this is because only a handful of deep-water ports in the world can accommodate such mega-ships, and they would be unable to pass through either the Suez or Panama canals.


On the opposite end of the size scale, Smil discusses Moore’s law, in which engineer Gordon Moore predicted in 1965 the rapid doubling of the number of components placed on a microchip. A graph of the number of transistors versus time shows that the law was maintained for many decades, but that there has been a slight levelling off since 2008. Smil links this tailing off to work by the US electrical engineer Robert Dennard, who showed that as transistors become smaller, they can be made to run faster without increasing the overall power consumption – but that this scaling effect had already begun to reach its limit by the 1990s. Future improvements may also be constrained both by the natural limits of lithography, the widely used light-printing technique, and by the enormous investments needed to develop a new manufacturing facility.


Size also goes beyond simply analysing technology. In an attempt to appreciate the human scale in all aspects of design, Smil starts with a lengthy discussion of the giants encountered in Gulliver’s Travels. We learn that – despite Jonathan Swift’s attempts to build his fictitious world with some plausibility – a modern understanding of materials reveals that his giants would have been unable to walk upright. Worse, their mass-to-surface-area ratio would have made it very difficult for them to cool themselves down, an issue mirrored in the significantly smaller Lilliputians, who would have had to eat almost constantly to maintain their body temperature.


The scattergun nature of the topics in the book might prevent it from ultimately presenting a cohesive thesis – but it is no less enjoyable for that. The lengthy discussion of normal distributions and how they apply to issues as diverse as income distribution, as well as the heights of basketball players, is both informative and entertaining. I also enjoyed the section on the human body and perceived attractiveness, which leads to an analysis of how we are represented in paintings. That in turn takes us to a wonderful rant about the supposed ubiquity of the “golden ratio” in art and design. Smil approaches this concept with some scepticism, concluding that the so-called ratio cannot be precisely expressed as a fraction, and is therefore not even truly a ratio.


All in all, I suspect that many Physics World readers would be delighted to find this book waiting for them under the Christmas tree. Indeed, it would be perfect reading material for anyone who enjoys a mathematical analysis of the world around them, and finds themselves with a little free time.



  • 2023 Penguin 304 pp £20hb


The post Size matters: the economies of scale, from the very big to the very small appeared first on Physics World.


]]>
https://hadamard.com/c/size-matters-the-economies-of-scale-from-the-very-big-to-the-very-small/feed/ 0 151
Cool tricks offer new solutions for quantum networking https://hadamard.com/c/cool-tricks-offer-new-solutions-for-quantum-networking/ https://hadamard.com/c/cool-tricks-offer-new-solutions-for-quantum-networking/#respond Tue, 07 Nov 2023 10:53:21 +0000 https://physicsworld.com/?p=110967 Continue reading Cool tricks offer new solutions for quantum networking]]> Emerging systems for quantum communications and cryptography rely on the ability to transmit single photons with high fidelity. Single-photon emitters based on quantum dots cooled to cryogenic temperatures have been shown to produce indistinguishable single photons with high brightness, but for practical use in real-world communications networks both the single-photon source and its cooling mechanism must be integrated into a standard rack-mounted unit.


Scientists at TU Berlin have recently shown that this tricky integration can be achieved with a Stirling cryocooler supplied by AMETEK Sunpower. They have built a plug-and-play testbed for quantum key distribution that emits single-photon pulses at telecoms wavelengths, and that combines the quantum-dot device, the cryocooler, and all the associated optical components into a standalone 19-inch module (Appl. Phys. Rev. 9 011412).


Other iterations of such quantum-dot emitters would typically require a bulky and complex cooling system to enable operation at temperatures below about 50 K, but the scientists at TU Berlin found that the compact Stirling cryocooler was able to maintain the required operating temperature without introducing unwanted vibrations into the system. These cryocoolers are already widely used in scientific instruments that require a low-noise background, such as infrared and radio-wave detectors for telescopes and superconducting quantum-interference devices (SQUIDs), while recent design improvements are widening their appeal for applications with more demanding requirements.


The Sunpower design features a free-piston mechanism that exploits gas bearings to enable friction-free operation. “The motion of the piston is driven by an electronic controller, while the oscillation of the moving parts charges the gas bearings to enable the piston and displacer to levitate on a film of gas,” explains Cliff Fralick of AMETEK Sunpower. “There is no lubrication used, and no maintenance needed, which ensures that these hermetically-sealed cryocoolers will have a long and dependable lifetime.”


Such contact-free operation has made these free-piston cryocoolers a popular choice for applications that demand reliable and robust cooling solutions. One stand-out example was a device that was designed to cool an imaging spectrometer onboard NASA’s RHESSI space mission, launched in 2002 to study the energetic particles released in solar flares. Despite a target mission lifetime of just two years, the cryocooler enabled the spectrometer to continue capturing images for 16 years, until the instrument was finally decommissioned in 2018.


Sunpower’s free-piston design also delivers higher cooling powers and a better thermal efficiency than other cryocoolers on the market. One of the most powerful models in the company’s range of compact devices, the Cryotel GT, removes heat at a rate of 16 W with 240 W of input power while maintaining a temperature of 77 K, achieving a cooling efficiency of nearly 20% of the theoretical Carnot limit. In addition, the high specific power of the design allows for a smaller size, with the GT measuring 276 mm long and 83 mm diameter, and with a mass of only 3 kg.
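
Those numbers can be sanity-checked with a one-line Carnot calculation, comparing the achieved coefficient of performance (16 W lifted per 240 W of input) with the ideal Carnot value for pumping heat from 77 K to room temperature. The heat-rejection temperature of 300 K used below is an assumption, as the article does not state it.

```python
# Quick check of the "nearly 20% of Carnot" figure for the CryoTel GT.
# Assumption: heat is rejected at an ambient of ~300 K (not given in the text).
Q_cold = 16.0      # W of heat lifted at the cold tip
P_in   = 240.0     # W of electrical input power
T_cold = 77.0      # K
T_hot  = 300.0     # K (assumed ambient)

cop_actual = Q_cold / P_in                    # ~0.067
cop_carnot = T_cold / (T_hot - T_cold)        # ~0.345
print(f"fraction of Carnot: {cop_actual / cop_carnot:.1%}")   # ~19%
```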


Such design parameters have made Sunpower’s cryocooler a popular choice for instruments that need to pick up faint signals. “In order to detect something that produces very little energy, it is necessary to generate a very cold background to minimize the noise floor and improve the signal-to-noise ratio,” says Fralick. “That applies to many scientific applications such as infrared detectors, SQUIDs, low-noise amplifiers, telescope instruments, and deep-space communications.”


Recent innovations have now widened the appeal of Sunpower’s cryocoolers for a range of new applications, particularly within the growing field of quantum technology. The company has recently released a premium version of the GT that offers a minimum temperature around 10 K lower than the standard version, while also increasing the usable heat load capacity at temperatures between 30 to 50 K.


“The GT typically has a minimum temperature of approximately 38 K, while the new GTLT can provide meaningful cooling power at temperatures down to 30 K,” says Fralick. “By boosting the cooling performance at lower temperatures, the GTLT expands the range of applications that can be addressed with our technology.”


Sunpower


The company has also been working to reduce the level of vibrations exported from the cryocoolers, since excessive vibrations created by the oscillating components have limited the adoption of Stirling cryocoolers in certain applications. All the cryocoolers are fitted with a passive balancer as standard, while Sunpower also offers the option of Active Vibration Cancellation (AVC) across all the instruments in its product range.


The company’s initial AVC offering reduced the level of exported vibrations by a factor of five, while its latest release – the AVC-GEN2 balancer – delivers a further two-fold performance improvement. “This yields a ten-fold reduction in exported vibrations compared to the passive balancer, which is a critical benefit for customers who are exploring applications in quantum technology,” says Fralick. “Combining the CryoTel GTLT with the AVC-GEN2 active balancer offers a compact solution that delivers the performance needed for these applications.”


Indeed, the QKD system developed at TU Berlin makes use of both these innovations to optimize the performance of the quantum-dot emitter while also minimizing vibrations inside the confines of the turnkey module. Another key customer in the quantum sector is UK start-up company Aegiq, which has developed a commercial single-photon source that exploits compact cryogenic cooling, ensuring that its module fits inside a 19-inch rack.


Sunpower is continuing to make improvements to its products and technology, with an ongoing drive towards colder temperatures, higher capacity cooling, and lower levels of exported vibration. “Quantum applications are growing quickly, and Sunpower is focused on delivering the technology advances that meet the demands of customers in this field,” says Fralick.


The post Cool tricks offer new solutions for quantum networking appeared first on Physics World.


]]>
https://hadamard.com/c/cool-tricks-offer-new-solutions-for-quantum-networking/feed/ 0 156
Electrons caught going around the bend https://hadamard.com/c/electrons-caught-going-around-the-bend/ https://hadamard.com/c/electrons-caught-going-around-the-bend/#respond Mon, 06 Nov 2023 15:00:34 +0000 https://physicsworld.com/?p=111019 Continue reading Electrons caught going around the bend]]> Graphs showing the smooth flow of photocurrent streamlines around a microscopic structure shaped like an airplane wing. Several silhouettes of an airplane taking off are shown for comparison

 

Taking inspiration from the flow of air around aeroplane wings, researchers in the US have imaged photoexcited electrons flowing around sharp bends for the first time. Because such bends are often found in integrated optoelectronic circuits, observing the electrons’ “streamlines” could lead to improvements in circuit design.

 

More than 80 years ago, the physicists William Shockley and Simon Ramo showed theoretically that when electrons travel around bends, their streamlines get locally compressed, producing heat. Until now, though, no-one had measured this effect directly because it is so difficult to observe the streamlines of electron photocurrents – that is, electric currents induced by light – through a working device.

 

In the new work, which is described in the Proceedings of the National Academy of Sciences, researchers led by physicists Nathaniel Gabor and David Mayes of the University of California, Riverside built a micromagnetic heterostructure device made from a layer of platinum on a yttrium iron garnet (YIG) substrate and placed it in a rotating magnetic field. They then directed a laser beam onto the YIG, causing the device to heat up and triggering a phenomenon known as the photo-Nernst effect. It is this effect that generates the photocurrent.

 

Observing the overall pattern of streamlines

 

By changing the direction of the external magnetic field, the team “inject the current in such a way that we not only control its source location, but also its direction,” explains Mayes. What is more, he adds, “it turns out that when you measure the electronic response as you do this over and over, you end up observing the overall pattern of streamlines.”

 

To demonstrate the power of their technique, the researchers repeated the experiments on a modified device called an electrofoil that enabled them to contort, compress and expand the photocurrent streamlines in the same way that aeroplane wings contort, compress, and expand the flow of air. In both scenarios, the streamlines represent the direction of flow that yields the greatest response at each point, as predicted by Shockley and Ramo’s theorem.

 

“Back in the late 1930s, these two eminent physicists realized that a free charge in a device does not have to reach an electrode to induce an electric response,” Mayes tells Physics World. “Instead, the motion of the free charges will affect all the other charges in the device due to the Coulomb force.

 

“Shockley and Ramo were able to show that the streamlines not only illustrate the ‘preferred’ current direction for the device, but that they also represent the pattern of current flow through it as if we had simply biased one end of the device and grounded the other.”
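
The quantitative statement behind this picture is the Shockley–Ramo theorem: a charge q moving with velocity v induces an instantaneous current i = q v·E_w on an electrode, where E_w is that electrode’s weighting field (the field it would produce if held at unit potential with all other electrodes grounded). The snippet below simply evaluates that dot product; the charge velocity and weighting-field values are arbitrary illustrative numbers.

```python
# Shockley-Ramo theorem: induced current i = q * (v . E_w), where E_w is the
# weighting field of the electrode. The velocity and weighting-field values
# below are arbitrary illustrative numbers, not measurements from the study.
q = -1.602e-19                     # electron charge in coulombs
v = (1.0e5, 2.0e4, 0.0)            # drift velocity in m/s (assumed)
E_w = (4.0e3, 1.0e3, 0.0)          # weighting field in 1/m (assumed)

i = q * sum(vi * ei for vi, ei in zip(v, E_w))
print(f"induced current: {i:.3e} A")
```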

 

Avoiding hot spots

 

Gabor notes that being able to determine where current flow lines are being compressed in a device could help circuit designers avoid creating such local hot spots. “The results from our study also suggest that you should not have sharp bend features in your electrical circuit,” he says, adding that gradually curving wires are “not the state-of-the-art right now”.

The researchers are now exploring ways of increasing the resolution of their technique while also testing new devices and materials. In particular, they would like to measure streamlines in devices fashioned into geometries such as a “Tesla valve”, which constrains electron flow in one direction.

“Our measurement tool is a powerful way to visualize and characterize charge flow in optoelectronic devices,” says Gabor. “We hope to advance our ideas towards new emerging materials that include both magnetic Nernst-like effects and unconventional current flow behaviour.”

The post Electrons caught going around the bend appeared first on Physics World.

]]>
https://hadamard.com/c/electrons-caught-going-around-the-bend/feed/ 0 158
Tube map of famous engineers, physics of Jackson Pollock, George Washington’s imperial love https://hadamard.com/c/tube-map-of-famous-engineers-physics-of-jackson-pollock-george-washingtons-imperial-love/ https://hadamard.com/c/tube-map-of-famous-engineers-physics-of-jackson-pollock-george-washingtons-imperial-love/#respond Fri, 03 Nov 2023 15:37:56 +0000 https://physicsworld.com/?p=110963 Continue reading Tube map of famous engineers, physics of Jackson Pollock, George Washington’s imperial love]]> Perhaps the most iconic map ever is Harry Beck’s depiction of the London Underground, which first appeared in the 1930s. Now, Transport for London (TfL) – which runs the Underground – has partnered with the Royal Academy of Engineering to create a Tube-themed map that depicts famous people in the history of engineering. Created to celebrate National Engineering Day on 1 November, the entire map can be viewed here.


The American artist Jackson Pollock was famous for abstract paintings made by dripping paint onto canvasses. It turns out that there is a lot of physics in Pollock’s technique. As viscous liquids are poured, there is a rich range of behaviour that can occur. A thin stream of paint can twist around in circles like a coiling rope, and a broad sheet of paint can fall in a cascade of folds.


Apparently, Pollock was a master of manipulating these effects to create his stunning paintings. Now, the applied mathematician L Mahadevan has come up with a way to use some of the same effects with a 3D printer. Based at Harvard University in the US, Mahadevan and colleagues were able to use their system to create a range of complicated 3D shapes.


Fluid instabilities


The work is based on two decades of fluid-dynamics research by Mahadevan, which explains how instabilities in fluid flow result in effects like coiling and folding. “We wanted to develop a technique that could take advantage of the folding and coiling instabilities, rather than avoid them,” says Gaurav Chaudhary, who worked on the project.


The team designed an algorithm that worked out how to manipulate the printer nozzle using a technique called deep reinforcement learning. As well as creating Pollock-style paintings,  the team also used their technique to decorate a cookie with chocolate syrup.


The technique is described in Soft Matter.


Canada made the transition from imperial measurements to the metric system in the mid-1970s. So fellow Canadians my age were taught both systems at school – and are very good at converting between systems. However, our neighbours to the south have so far resisted the metric system, much to the amusement of the rest of the world – and the consternation of some Americans.



The above comedy sketch from Saturday Night Live explores America’s relationship with imperial measures at the founding of the nation. George Washington expounds on the virtues of having 5280 feet in a mile. He also touches on the puzzling fact that (in the future) soda pop in the US will often be sold by the litre, whereas paint and milk will be sold by the gallon.


I think the writers missed a trick by mentioning the ultimate irony of Washington’s enthusiasm for the imperial system – the fact that it was invented by his arch enemy, the British.


The post Tube map of famous engineers, physics of Jackson Pollock, George Washington’s imperial love appeared first on Physics World.


]]>
https://hadamard.com/c/tube-map-of-famous-engineers-physics-of-jackson-pollock-george-washingtons-imperial-love/feed/ 0 161
Celebrating the physics of the cosmos and 20 years of JCAP https://hadamard.com/c/celebrating-the-physics-of-the-cosmos-and-20-years-of-jcap/ https://hadamard.com/c/celebrating-the-physics-of-the-cosmos-and-20-years-of-jcap/#respond Thu, 02 Nov 2023 09:12:22 +0000 https://physicsworld.com/?p=110935 Continue reading Celebrating the physics of the cosmos and 20 years of JCAP]]> Some of the biggest mysteries of physics – including the nature of dark matter and dark energy, and the origin of the universe – are in the sights of cosmologists and astroparticle physicists.

In this episode of the Physics World Weekly podcast I am in conversation with three editorial board members of the Journal of Cosmology and Astroparticle Physics (JCAP) which is celebrating its 20th anniversary.

 

They are the cosmologist and theoretical physicist Licia Verde who is at the Institute of Cosmos Sciences at Spain’s University of Barcelona; Erminia Calabrese, who is an observational cosmologist at the UK’s Cardiff University; and the astroparticle physicist Anne Green, who is at the University of Nottingham in the UK.

 

We chat about major breakthroughs in cosmology and astroparticle physics over the past two decades and look forward to the future of the fields.

 

This podcast is sponsored by the Electrochemical Society.

 

    • JCAP has published a special retrospective collection of some of the leading papers that have been published in the journal since 2003. Next week, the journal will publish a special anniversary issue of new papers.

The post Celebrating the physics of the cosmos and 20 years of JCAP appeared first on Physics World.

]]>
https://hadamard.com/c/celebrating-the-physics-of-the-cosmos-and-20-years-of-jcap/feed/ 0 164
The brain may learn about the world the same way some computational models do https://hadamard.com/c/the-brain-may-learn-about-the-world-the-same-way-some-computational-models-do/ https://hadamard.com/c/the-brain-may-learn-about-the-world-the-same-way-some-computational-models-do/#respond Mon, 30 Oct 2023 04:00:00 +0000 https://news.mit.edu/2023/brain-self-supervised-computational-models-1030 Continue reading The brain may learn about the world the same way some computational models do]]> To make our way through the world, our brain must develop an intuitive understanding of the physical world around us, which we then use to interpret sensory information coming into the brain.

How does the brain develop that intuitive understanding? Many scientists believe that it may use a process similar to what’s known as “self-supervised learning.” This type of machine learning, originally developed as a way to create more efficient models for computer vision, allows computational models to learn about visual scenes based solely on the similarities and differences between them, with no labels or other information.

A pair of studies from researchers at the K. Lisa Yang Integrative Computational Neuroscience (ICoN) Center at MIT offers new evidence supporting this hypothesis. The researchers found that when they trained models known as neural networks using a particular type of self-supervised learning, the resulting models generated activity patterns very similar to those seen in the brains of animals that were performing the same tasks as the models.

The findings suggest that these models are able to learn representations of the physical world that they can use to make accurate predictions about what will happen in that world, and that the mammalian brain may be using the same strategy, the researchers say.

“The theme of our work is that AI designed to help build better robots ends up also being a framework to better understand the brain more generally,” says Aran Nayebi, a postdoc in the ICoN Center. “We can’t say if it’s the whole brain yet, but across scales and disparate brain areas, our results seem to be suggestive of an organizing principle.”

Nayebi is the lead author of one of the studies, co-authored with Rishi Rajalingham, a former MIT postdoc now at Meta Reality Labs, and senior authors Mehrdad Jazayeri, an associate professor of brain and cognitive sciences and a member of the McGovern Institute for Brain Research; and Robert Yang, an assistant professor of brain and cognitive sciences and an associate member of the McGovern Institute. Ila Fiete, director of the ICoN Center, a professor of brain and cognitive sciences, and an associate member of the McGovern Institute, is the senior author of the other study, which was co-led by Mikail Khona, an MIT graduate student, and Rylan Schaeffer, a former senior research associate at MIT.

Both studies will be presented at the 2023 Conference on Neural Information Processing Systems (NeurIPS) in December.

Modeling the physical world

Early models of computer vision mainly relied on supervised learning. Using this approach, models are trained to classify images that are each labeled with a name — cat, car, etc. The resulting models work well, but this type of training requires a great deal of human-labeled data.

To create a more efficient alternative, in recent years researchers have turned to models built through a technique known as contrastive self-supervised learning. This type of learning allows an algorithm to learn to classify objects based on how similar they are to each other, with no external labels provided.

“This is a very powerful method because you can now leverage very large modern data sets, especially videos, and really unlock their potential,” Nayebi says. “A lot of the modern AI that you see now, especially in the last couple years with ChatGPT and GPT-4, is a result of training a self-supervised objective function on a large-scale dataset to obtain a very flexible representation.”

These types of models, also called neural networks, consist of thousands or millions of processing units connected to each other. Each node has connections of varying strengths to other nodes in the network. As the network analyzes huge amounts of data, the strengths of those connections change as the network learns to perform the desired task.

As the model performs a particular task, the activity patterns of different units within the network can be measured. Each unit’s activity can be represented as a firing pattern, similar to the firing patterns of neurons in the brain. Previous work from Nayebi and others has shown that self-supervised models of vision generate activity similar to that seen in the visual processing system of mammalian brains.

In both of the new NeurIPS studies, the researchers set out to explore whether self-supervised computational models of other cognitive functions might also show similarities to the mammalian brain. In the study led by Nayebi, the researchers trained self-supervised models to predict the future state of their environment across hundreds of thousands of naturalistic videos depicting everyday scenarios.

“For the last decade or so, the dominant method to build neural network models in cognitive neuroscience is to train these networks on individual cognitive tasks. But models trained this way rarely generalize to other tasks,” Yang says. “Here we test whether we can build models for some aspect of cognition by first training on naturalistic data using self-supervised learning, then evaluating in lab settings.”

Once the model was trained, the researchers had it generalize to a task they call “Mental-Pong.” This is similar to the video game Pong, where a player moves a paddle to hit a ball traveling across the screen. In the Mental-Pong version, the ball disappears shortly before hitting the paddle, so the player has to estimate its trajectory in order to hit the ball.

The researchers found that the model was able to track the hidden ball’s trajectory with accuracy similar to that of neurons in the mammalian brain, which had been shown in a previous study by Rajalingham and Jazayeri to simulate its trajectory — a cognitive phenomenon known as “mental simulation.” Furthermore, the neural activation patterns seen within the model were similar to those seen in the brains of animals as they played the game — specifically, in a part of the brain called the dorsomedial frontal cortex. No other class of computational model has been able to match the biological data as closely as this one, the researchers say.

“There are many efforts in the machine learning community to create artificial intelligence,” Jazayeri says. “The relevance of these models to neurobiology hinges on their ability to additionally capture the inner workings of the brain. The fact that Aran’s model predicts neural data is really important as it suggests that we may be getting closer to building artificial systems that emulate natural intelligence.”

Navigating the world

The study led by Khona, Schaeffer, and Fiete focused on a type of specialized neurons known as grid cells. These cells, located in the entorhinal cortex, help animals to navigate, working together with place cells located in the hippocampus.

While place cells fire whenever an animal is in a specific location, grid cells fire only when the animal is at one of the vertices of a triangular lattice. Groups of grid cells create overlapping lattices of different sizes, which allows them to encode a large number of positions using a relatively small number of cells.

In recent studies, researchers have trained supervised neural networks to mimic grid cell function by predicting an animal’s next location based on its starting point and velocity, a task known as path integration. However, these models hinged on access to privileged information about absolute space at all times — information that the animal does not have.

Inspired by the striking coding properties of the multiperiodic grid-cell code for space, the MIT team trained a contrastive self-supervised model to both perform this same path integration task and represent space efficiently while doing so. For the training data, they used sequences of velocity inputs. The model learned to distinguish positions based on whether they were similar or different — nearby positions generated similar codes, but further positions generated more different codes.

“It’s similar to training models on images, where if two images are both heads of cats, their codes should be similar, but if one is the head of a cat and one is a truck, then you want their codes to repel,” Khona says. “We’re taking that same idea but applying it to spatial trajectories.”

Once the model was trained, the researchers found that the activation patterns of the nodes within the model formed several lattice patterns with different periods, very similar to those formed by grid cells in the brain.

“What excites me about this work is that it makes connections between mathematical work on the striking information-theoretic properties of the grid cell code and the computation of path integration,” Fiete says. “While the mathematical work was analytic — what properties does the grid cell code possess? — the approach of optimizing coding efficiency through self-supervised learning and obtaining grid-like tuning is synthetic: It shows what properties might be necessary and sufficient to explain why the brain has grid cells.”

The research was funded by the K. Lisa Yang ICoN Center, the National Institutes of Health, the Simons Foundation, the McKnight Foundation, the McGovern Institute, and the Helen Hay Whitney Foundation.
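
As a rough illustration of the contrastive self-supervised objective described above (a toy sketch, not the authors' models or training code), the following numpy example computes an InfoNCE-style loss in which each embedding is pulled towards its "similar" partner and pushed away from all the others.

```python
# Toy contrastive (InfoNCE-style) loss: matched pairs attract, mismatched pairs repel.
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """anchors, positives: (N, D) arrays of L2-normalized embeddings.
    Row i of `positives` is the "similar" view of row i of `anchors`."""
    sim = anchors @ positives.T / temperature          # (N, N) similarity matrix
    sim -= sim.max(axis=1, keepdims=True)              # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))                 # matched pairs sit on the diagonal

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
z /= np.linalg.norm(z, axis=1, keepdims=True)
z_aug = z + 0.05 * rng.normal(size=z.shape)            # slightly perturbed "similar" views
z_aug /= np.linalg.norm(z_aug, axis=1, keepdims=True)
print("contrastive loss:", info_nce(z, z_aug))
```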

]]>
https://hadamard.com/c/the-brain-may-learn-about-the-world-the-same-way-some-computational-models-do/feed/ 0 243
Three-qubit computing platform is made from electron spins https://hadamard.com/c/three-qubit-computing-platform-is-made-from-electron-spins/ https://hadamard.com/c/three-qubit-computing-platform-is-made-from-electron-spins/#respond Sat, 28 Oct 2023 14:38:19 +0000 https://physicsworld.com/?p=110865 Continue reading Three-qubit computing platform is made from electron spins]]> Electron spin qubits


A quantum computing platform that is capable of the simultaneous operation of multiple spin-based quantum bits (qubits) has been created by researchers in South Korea. Designed by Yujeong Bae, Soo-hyon Phark, Andreas Heinrich and colleagues at the Institute for Basic Science in Seoul, the system is assembled atom-by-atom using a scanning tunnelling microscope (STM).


While quantum computers of the future should be able to outperform conventional computers at certain tasks, today’s nascent quantum processors are still too small and noisy to do practical calculations. Much more must be done to create qubit platforms that can retain information for long enough to make quantum computing viable.


Qubits have already been developed using several different technologies, including superconducting circuits and trapped ions. Some physicists are also keen on creating qubits using the spins of individual electrons – but such qubits are not as advanced as some of their counterparts. However, that does not mean that spin-based qubits are out of the running.


“At this point, all existing platforms for quantum computing have major drawbacks, so it is imperative to investigate new approaches,” explains Heinrich.


Precise assembly


To create a viable spin-based processor, qubits must be assembled precisely, coupled together reliably, and operated in a quantum-coherent manner, all on the same platform. This combination had eluded researchers until now, according to the Seoul-based team.


The researchers created their multi-qubit platform with the help of an STM, which is a powerful tool for imaging and manipulating matter on atomic scales. When the conducting tip of an STM is brought very close to a sample surface, electrons are able to quantum-mechanically tunnel between the tip and the sample surface.


Since the probability of tunnelling strongly depends on the distance between tip and surface, an STM can map out the sample’s nanoscale topography by measuring the current of these tunnelling electrons. Individual atoms on the surface can also be manipulated and assembled by pushing them around with the nanoscale forces applied by the tip.
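
The extreme distance sensitivity comes from the exponential decay of the tunnelling probability. A back-of-the-envelope sketch of the textbook one-dimensional picture (with an assumed barrier height, not a value from this work) shows why sub-ångström changes in tip height are visible in the current:

```python
# In the simplest 1D model the tunnelling current falls off as
# I ∝ exp(-2*kappa*d), with kappa = sqrt(2*m*phi)/hbar for barrier height phi.
import numpy as np

hbar = 1.054571817e-34      # J s
m_e = 9.1093837015e-31      # electron mass, kg
eV = 1.602176634e-19        # J per eV

phi = 4.5 * eV              # assumed work-function-like barrier of ~4.5 eV
kappa = np.sqrt(2 * m_e * phi) / hbar    # ~1.1e10 per metre

for d in (0.4e-9, 0.5e-9):  # tip-sample gaps of 0.4 nm and 0.5 nm
    print(f"d = {d*1e9:.1f} nm, relative current ~ {np.exp(-2*kappa*d):.2e}")
# Retracting the tip by just 0.1 nm cuts the current by roughly an order of magnitude.
```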


Using these capabilities the team has “demonstrated the first qubit platform with atomic scale precision,” according to Heinrich. “It is based on electron spins on surfaces, which can be placed at atomically precise distances from each other.”


Sensor qubit


Using STM, the researchers assembled their system on the pristine surface of a magnesium oxide bilayer film. The system includes a “sensor” qubit, which is a spin-1/2 titanium atom that is located directly below the STM tip. The tip is coated in iron atoms, which means that it can be used to apply a local magnetic field.


To either side of the tip are a pair of “remote” qubits – also spin-1/2 titanium atoms. These are placed at precise distances from the sensor qubit, outside the region where electron tunnelling between atoms can occur.


To control the remote qubits simultaneously with the sensor qubit, the team created a magnetic field gradient by placing iron atoms nearby. The iron atoms behave as single-atom magnets because their spin relaxation times far exceed the operation times of individual qubits.


In this way, the iron atoms each act as a substitute for the STM tip in providing a static, local magnetic field for aligning the spins of each remote qubit. Transitions between the spin states of the qubits are done by using the STM tip to apply radio-frequency pulses to the system – a technique called electron spin resonance.
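
In electron spin resonance, a resonant radio-frequency pulse rotates the spin coherently, so the spin-flip probability oscillates as P(t) = sin²(Ω_R t/2). The sketch below is purely illustrative; the 10 MHz Rabi frequency is an assumption, not a number from the paper.

```python
# Rabi oscillations under a resonant ESR drive.
import numpy as np

f_rabi = 10e6                        # assumed Rabi frequency, 10 MHz
omega_r = 2 * np.pi * f_rabi
t = np.linspace(0, 200e-9, 5)        # pulse lengths from 0 to 200 ns
p_flip = np.sin(omega_r * t / 2) ** 2
for ti, pi in zip(t, p_flip):
    print(f"t = {ti*1e9:5.1f} ns, P(flip) = {pi:.2f}")
# A "pi pulse" (here 50 ns) flips the spin with probability ~1;
# a pulse twice as long brings it back to its starting state.
```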


Addressed and manipulated


The team initialised their qubits by cooling them to 0.4 K, then applying an external magnetic field to bring them into the same spin state and coupling them together. Afterwards, the state of the sensor qubit depended reliably on the states of both remote qubits, but could still be addressed and manipulated individually by the STM tip.


The overall result was an entirely new qubit platform that allowed multiple qubits to be operated simultaneously. “Our study has achieved single qubit, two qubit, and three qubit gates with good quantum coherence,” Heinrich says.


He adds that, “the platform has its pros and cons. On the pros, it is atomically precise and hence can be easily duplicated. On the cons, the quantum coherence is good but needs to be improved further.”


If these challenges can be overcome, Heinrich and colleagues see a bright future for their system.


“We believe that this approach can relatively easily be scaled to tens of electron qubits,” Heinrich says. “Those electron spins can also be controllably coupled to nuclear spins which might enable efficient quantum error correction and increase the available Hilbert space for quantum operations. We have just scratched the surface!”


The research is described in Science.


The post Three-qubit computing platform is made from electron spins appeared first on Physics World.


]]>
https://hadamard.com/c/three-qubit-computing-platform-is-made-from-electron-spins/feed/ 0 174
What can postage stamps tell us about the history of nuclear physics? https://hadamard.com/c/what-can-postage-stamps-tell-us-about-the-history-of-nuclear-physics/ https://hadamard.com/c/what-can-postage-stamps-tell-us-about-the-history-of-nuclear-physics/#respond Fri, 27 Oct 2023 10:00:41 +0000 https://physicsworld.com/?p=110339 Continue reading What can postage stamps tell us about the history of nuclear physics?]]> In December 1942 US president Franklin D Roosevelt signed the Manhattan Project into existence. A scientific endeavour that culminated in the dropping of the Little Boy and Fat Man bombs three years later, the project was – for better or worse – the most significant development in the long history of nuclear physics. What is perhaps surprising, though, is that this pioneering field of discovery is captured forever through the medium of postage stamps.





Our story begins with Marie Curie, who shared the 1903 Nobel Prize for Physics with Pierre Curie for their studies of radioactivity. This phenomenon had been discovered in 1896 by Henri Becquerel, who won the other half of that year’s prize, but it is Marie Curie who is easily the most famous of the three scientists. She has appeared on more than 600 postage stamps and therefore holds the record as the physicist with the most stamps ever issued in their name. My favourite is the 1938 Afghanistan 15 pul stamp, which is the only one featuring Curie with her electrometer and was also the first stamp to depict a female scientist.


From her lab in Paris, Curie famously studied the radiation emitted by pitchblende – a glowing mix of uranium oxide and lead, which hailed from the Jáchymov mine in Bohemia, now part of Czechia. Known for its production of silver, the ore was delivered to Curie, who also used it to discover the elements polonium and radium. The mine’s fame as the birthplace of nuclear science was commemorated by the former Czechoslovakia in 1966 with a 60 haléř stamp.


Ernest Rutherford – the New-Zealand-born physicist who discovered the atomic nucleus – is also commemorated on several stamps. One I particularly like was issued by New Zealand in 1971 to commemorate the centenary of his birth. The 1 cent stamp of the set includes a portrait of Rutherford along with a diagram of the Rutherford atomic model, which – correctly – envisaged electrons surrounding a dense central nucleus. The stamp nicely shows alpha particles being scattered back from the nucleus – the famous “gold-foil” experiment found in every school physics syllabus.


1 cent New Zealand stamp


Rutherford could – and perhaps should – have won a Nobel prize for his discovery of the nucleus but he of course won the Nobel Prize for Chemistry in 1908 for his work on the decay of radium. The Nobel committee obviously viewed radioactivity as chemistry, not physics, prompting Rutherford to famously remark that he had dealt with many different transformations, but that the quickest was his “own transformation in one moment from a physicist to a chemist”. Be that as it may, winning a Nobel prize is a sure-fire way to philatelic fame.


The Danish physicist Niels Bohr – who won the 1922 Nobel Prize for Physics for his work on the structure of atoms – has appeared on several Swedish stamps but my favourite is actually a Greenland 1963 issue, celebrating 50 years of “Bohr theory”, which describes how electrons exist in discrete orbits and can jump between them. I like this stamp because rather than containing just a visual portrait of the scientist, as was the trend until then, it also depicts Bohr’s work in the form of an equation (hν = E2 − E1) and a diagram of orbiting electrons.


1963 Greenland stamp showing a photo of Niels Bohr and an illustration of his electron model
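
The relation on the stamp, hν = E2 − E1, is easy to put numbers to with the Bohr model of hydrogen (E_n = −13.6 eV/n²). The short sketch below, included purely as an illustration rather than anything tied to the stamp itself, computes the red Balmer line produced by a 3 → 2 jump.

```python
# Photon frequency for a Bohr-model hydrogen transition, via hν = E2 − E1.
h = 6.62607015e-34          # Planck constant, J s
eV = 1.602176634e-19        # J per eV

def bohr_energy(n):
    return -13.6 * eV / n**2    # Bohr-model energy level of hydrogen

dE = bohr_energy(3) - bohr_energy(2)    # energy released in the 3 -> 2 jump
nu = dE / h                              # photon frequency
print(f"ΔE = {dE/eV:.2f} eV, ν = {nu:.2e} Hz, λ = {3e8/nu*1e9:.0f} nm")
# ≈ 1.89 eV, 4.6e14 Hz, 656 nm: the red H-alpha line.
```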


As the 1920s turned into the 1930s, the pace of research in nuclear physics picked up. In 1932 James Chadwick discovered the neutron. In 1938 Otto Hahn and Fritz Strassmann, along with Lise Meitner and Otto Frisch (working under Bohr), discovered atomic fission. In 1939 Frédéric Joliot-Curie, Enrico Fermi and Leo Szilard confirmed the chain reaction experimentally. The final pieces of the bomb jigsaw were provided by Francis Perrin, who calculated the critical mass of uranium needed for a self-sustaining reaction, along with further work from Rudolf Peierls in Birmingham, UK.





Discovery in science is a bit like a self-sustaining reaction, in which new ideas are built on old ones and researchers stand on the shoulders of the giants who went before. Images of postage stamps are a great reminder of the role of science in the world around us and yet, they can also entrench inequities. The beautiful 60 pfennig German stamp first issued in 1979, for example, shows the splitting of a uranium nucleus but it mentions only Hahn, who was awarded the 1944 Nobel Prize for Chemistry. His co-discoverers – Meitner, Strassmann and Frisch – were left empty-handed and are, once again, omitted from history.


Stamps don’t just reflect history but can shape it too.


The post What can postage stamps tell us about the history of nuclear physics? appeared first on Physics World.


]]>
https://hadamard.com/c/what-can-postage-stamps-tell-us-about-the-history-of-nuclear-physics/feed/ 0 177
Pioneering the physics of adaptation, writing the history of quantum computing https://hadamard.com/c/pioneering-the-physics-of-adaptation-writing-the-history-of-quantum-computing/ https://hadamard.com/c/pioneering-the-physics-of-adaptation-writing-the-history-of-quantum-computing/#respond Thu, 26 Oct 2023 13:39:09 +0000 https://physicsworld.com/?p=110854 Continue reading Pioneering the physics of adaptation, writing the history of quantum computing]]> n

This episode of the Physics World Weekly podcast features two pioneers in their fields.


Margaret Gardel is a biophysicist who is setting up a new National Science Foundation Physics Frontier Center at the University of Chicago. The Center for Living Systems will focus on the physics of adaptation, a new field that looks at how living matter stores, retrieves, and processes information as it adapts to change. Gardel explains how physics-inspired theory and experiments are providing fresh insights into biological systems.


Our second pioneer is Susannah Glickman, who has just completed what is probably the first scholarly history of quantum computing. A historian based at Stony Brook University in the US, Glickman explains why there has been so much enthusiasm for quantum computers, despite the fact that the technology is far from settled. She also talks about the process of writing her history and the generosity of some of the quantum-computing experts who provided her with crucial information about how the field has developed.


The post Pioneering the physics of adaptation, writing the history of quantum computing appeared first on Physics World.


]]>
https://hadamard.com/c/pioneering-the-physics-of-adaptation-writing-the-history-of-quantum-computing/feed/ 0 178
Bright flash leads astronomers to a heavy-metal factory 900 million light years away https://hadamard.com/c/bright-flash-leads-astronomers-to-a-heavy-metal-factory-900-million-light-years-away/ https://hadamard.com/c/bright-flash-leads-astronomers-to-a-heavy-metal-factory-900-million-light-years-away/#respond Wed, 25 Oct 2023 15:00:00 +0000 https://news.mit.edu/2023/bright-flash-leads-astronomers-tellurium-detection-1025 Continue reading Bright flash leads astronomers to a heavy-metal factory 900 million light years away]]> An extraordinary burst of high-energy light in the sky has pointed astronomers to a pair of metal-forging neutron stars 900 million light years from Earth.

In a study appearing today in Nature, an international team of astronomers, including scientists at MIT, reports the detection of an extremely bright gamma-ray burst (GRB), which is the most powerful type of explosion known in the universe. This particular GRB is the second-brightest so far detected, and the astronomers subsequently traced the burst’s origin to two merging neutron stars. Neutron stars are the collapsed, ultradense cores of massive stars, and are thought to be where many of the universe’s heavy metals are forged.

The team found that as the stars circled each other and eventually merged, they gave off an enormous amount of energy in the form of the GRB. And, in a first, the astronomers directly detected signs of heavy metals in the stellar aftermath. Specifically, they picked up a clear signal of tellurium, a heavy, mildly toxic element that is rarer than platinum on Earth but thought to be abundant throughout the universe.

The astronomers estimate that the merger gave off enough tellurium to equal the mass of 300 Earths. And if tellurium is present, the merger must have churned up other closely related elements such as iodine, which is an essential mineral nutrient for much of life on Earth.

The discovery was made through the collective effort of astronomers around the world, using NASA’s James Webb Space Telescope (JWST) as well as other ground and space telescopes, including NASA’s TESS satellite (an MIT-led mission), and the Very Large Telescope (VLT) in Chile, which scientists at MIT used to contribute to the discovery.

“This discovery is a major step forward in our understanding of the formation sites of heavy elements in the universe, and demonstrates the power of combining observations in different wavelengths to reveal new insights into these extremely energetic explosions,” says study co-author Benjamin Schneider, a postdoc in MIT’s Kavli Institute for Astrophysics and Space Research.

Schneider is one of many researchers from multiple institutions around the world who contributed to the study, which was led by Andrew Levan of Radboud University in the Netherlands and the University of Warwick in the United Kingdom.

“Everything all at once”

The initial burst was detected on March 7, 2023, by NASA’s Fermi Gamma-Ray Space Telescope, and was determined to be an exceptionally bright gamma-ray burst, which astronomers labeled GRB 230307A.

“It might be difficult to overstate how bright it was,” says Michael Fausnaugh, who was a research scientist at MIT at the time and is now an assistant professor at Texas Tech University. “In gamma-ray astronomy, you’re usually counting individual photons. But so many photons came in that the detector couldn’t distinguish individual ones. It was kind of like the dial hit the max.”

The ultrabright burst was also exceptionally long, lasting 200 seconds, whereas neutron star mergers typically result in short GRBs that flash for less than two seconds. The bright and long-lasting flare drew immediate interest around the world, as astronomers focused a host of other telescopes towards the burst. This time, the burst’s brightness worked to scientists’ advantage, as the gamma-ray flare was detected by satellites across the solar system. By triangulating these observations, astronomers could zero in on the burst’s location — in the southern sky, within the Mensa constellation.

At MIT, Schneider and Fausnaugh joined the multipronged search. Shortly after Fermi’s initial detection, Fausnaugh checked to see whether the burst showed up in data taken by the TESS satellite, which happened to be pointing toward the same section of the sky where GRB 230307A was initially detected. Fausnaugh went back through that portion of TESS data and spotted the burst, then traced its activity from beginning to end.

“We could see everything all at once,” Fausnaugh says. “We saw a really bright flash, followed by a little bump, or afterglow. That was a very unique light curve. Without TESS, it is almost impossible to observe the early optical flash that occurs at the same time as the gamma rays.”

Meanwhile, Schneider examined the burst with another, ground-based scope: the Very Large Telescope (VLT) in Chile. As a member of a large GRB-observing program running on this telescope, Schneider happened to be on shift soon after Fermi’s initial observation and focused the telescope toward the burst.

VLT’s observations echoed TESS’ data and revealed an equally curious pattern: The GRB’s emissions appeared to transition quickly from blue to red wavelengths. This pattern is characteristic of a kilonova — a massive explosion that typically occurs when two neutron stars collide. The MIT group’s analyses, combined with other observations around the world, helped to determine that the GRB was likely the product of two merging neutron stars.

A stellar kick

Where did the merger itself originate? For this, astronomers turned to the deep-field view of JWST, which can see further into space than any other telescope. Astronomers used JWST to observe GRB 230307A, hoping to pick out the host galaxy where the neutron stars originated. The telescope’s images revealed that, strangely, the GRB appeared to be unmoored from any host galaxy. But there did appear to be a nearby galaxy, some 120,000 light years away.

The telescope’s observations suggest that the neutron stars were kicked out of the nearby galaxy. They likely formed as a pair of massive stars in a binary system. Eventually, both stars collapsed into neutron stars, in powerful events that effectively “kicked” the pair out of their home galaxy, causing them to escape to a new location where they slowly circled in on each other and merged, several hundred million years later.

Amid the merger’s energetic emissions, JWST also detected a clear signal of tellurium. While most stars can churn up lighter elements up to iron, it’s thought that all other, heavier elements in the universe were forged in more extreme environments, such as a neutron star merger. JWST’s detection of tellurium further confirmed that the initial gamma-ray burst was produced by a neutron star merger.

“For JWST, it’s only the beginning, and it has already made a huge difference,” Schneider says. “In the coming years, more neutron star mergers will be detected. The combination of JWST with other powerful observatories will be crucial for shedding light on the nature of these extreme explosions.”

]]>
https://hadamard.com/c/bright-flash-leads-astronomers-to-a-heavy-metal-factory-900-million-light-years-away/feed/ 0 245
Electrons accelerated by firing lasers into nanophotonic cavities https://hadamard.com/c/electrons-accelerated-by-firing-lasers-into-nanophotonic-cavities/ https://hadamard.com/c/electrons-accelerated-by-firing-lasers-into-nanophotonic-cavities/#respond Wed, 25 Oct 2023 08:38:43 +0000 https://physicsworld.com/?p=110809 Continue reading Electrons accelerated by firing lasers into nanophotonic cavities]]> Laser-driven particle accelerators on silicon chips have been created by two independent research groups. With further improvements, such dielectric laser accelerators could be used in medicine and industry – and could even find application in high-energy particle physics experiments.


Accelerating electrons to high energies is normally done over long distances at large and expensive facilities. The electron accelerator at the heart of the European X-ray Free Electron Laser in Germany, for example, is 3.4 km long and the Stanford Linear Accelerator (SLAC) in California was 3.2 km long.


As a result, the use of electron accelerators for practical applications in medicine and industry is severely restricted. Size and cost are also factors in accelerator-based particle physics, where facilities are getting bigger and more expensive as they reach for higher collision energies.


Surfers on a wave


In a conventional accelerator, microwave oscillations of electric fields in metallic cavities accelerate electrons like surfers on a travelling wave. The maximum acceleration gradient is typically a few dozen megavolts per metre, and is defined by the maximum electric field that can exist between metallic components in a cavity.


“Nobody knows exactly what’s happening at the [metallic] surface and this is still an active field of research…but when the fields get too large something like tiny little pyramids grow on the surface, and then electrons spray out and the field just breaks down,” says Peter Hommelhoff of Friedrich-Alexander University Erlangen-Nürnberg in Germany.


The cost and technological challenges of conventional accelerators mean that researchers are keen on developing alternative acceleration methods. In this latest research, the oscillating electric fields are created by firing laser pulses into tiny optical cavities made from silicon nanostructures.


Hommelhoff says it took almost thirty years before physicists realized that electron acceleration could also be achieved using nanophotonic cavities driven by optical-frequency light. Using optical light helps scale down the device because the wavelength of the radiation is much shorter than that of microwaves.


No metal required


Hommelhoff points out another important benefit of this approach: “When you drive these frequencies with laser light, you don’t need metal structures”. He adds, “It suffices if you just use regular glass…and you can generate the same mode that you can generate with microwave cavities and microwave fields”.


As the cavity is an insulator, high concentrations of charge do not appear at points on the surface. As a result, the only limit to the acceleration gradient is the electrical breakdown field of the material.


In principle, this allows for the nanophotonic integration of a particle accelerator, producing bunches of electrons in a tiny, precisely-focused beamline. However, there are practical challenges. The electrons in each bunch repel each other and holding a bunch together requires focusing by external forces. Moreover, compression of a bunch in one direction causes it to spread in other directions.


Repulsion problem


In previous work, researchers including Hommelhoff and Olav Solgaard of Stanford University in California have demonstrated that this repulsion problem could be mitigated using alternating phase focusing. In this technique, electrons are alternately confined in one direction and then the other, producing an oscillating field distribution.


Now, new work on these accelerators has been done by two independent research groups. One was led by Hommelhoff at Friedrich-Alexander University. The other group was a collaboration between Stanford scientists led by Solgaard and researchers at TU Darmstadt in Germany led by Uwe Niedermayer. Both teams created nanophotonic dielectric laser accelerators that boosted the energy of electron bunches without the bunches breaking up. Solgaard and Niedermayer’s team fabricated two accelerators – one designed at Stanford and one at TU Darmstadt. One accelerator boosted the energy of 96 keV electrons by 25% over a distance of just 708 μm. This is about ten times the thickness of a human hair.


“I think that I have put more force on an electron than anybody else ever,” says Solgaard.


The Hommelhoff group’s device worked at lower energies, accelerating electrons from 28.4 keV to 40.7 keV over 500 μm. This presented its own challenges, as Hommelhoff explains. “When you want to accelerate electrons that are non-relativistic – in our case they only travel with one third of the speed of light – it’s not so easy and it’s less efficient to generate the optical mode that co-propagates with the electrons.”
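
A quick back-of-the-envelope check on the numbers quoted above (an illustration, not a calculation from either paper) gives the average acceleration gradients the two devices achieved:

```python
# Average acceleration gradient = energy gain / interaction length.
def gradient_MV_per_m(e_in_keV, e_out_keV, length_um):
    gain_eV = (e_out_keV - e_in_keV) * 1e3        # energy gain in eV
    return gain_eV / (length_um * 1e-6) / 1e6     # gradient in MV/m

# Stanford / TU Darmstadt device: 96 keV electrons boosted by 25% over 708 micrometres
print(f"{gradient_MV_per_m(96, 96 * 1.25, 708):.0f} MV/m")   # ~34 MV/m
# Erlangen device: 28.4 keV -> 40.7 keV over 500 micrometres
print(f"{gradient_MV_per_m(28.4, 40.7, 500):.0f} MV/m")      # ~25 MV/m
```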


Higher breakdown fields


The researchers are now looking to achieve even higher field gradients by fabricating devices in materials with higher breakdown fields than silicon. They believe that in the near term their acceleration schemes could find applications in medical imaging and in searches for dark matter.


Solgaard says he “might be in a very small minority thinking this is going to play a role in high-energy physics,” but that the technology should be usable in materials such as quartz, whose breakdown field is almost 1000 times that of a traditional accelerator. “Our millimetre becomes a metre,” he says; “by the time we get to a metre we should match SLAC in energy…Think about having an accelerator sitting in my office that matches SLAC.”


“I think these [two teams] have demonstrated an important new step towards a real accelerator on a chip,” says accelerator scientist Carsten Welsch of the University of Liverpool in the UK. However, he cautions that much remains to be done in terms of beam control and miniature diagnostics. In terms of applications, he says:  “I share their optimism for catheter-like medical applications, bringing electrons to where they are needed, and in particular for mini-light sources where personally I see the biggest potential. The combination of a high quality electron beam and light could really open completely new research opportunities and applications.”


However, Welsch remains unconvinced about applications such as particle colliders, pointing to the required high luminosity and high beam quality needed in such machines. “The next Large Hadron Collider will not be a dielectric laser accelerator,” he concludes.


Hommelhoff and colleagues describe their work in Nature. Solgaard, Niedermayer and colleagues describe their work on arXiv.


The post Electrons accelerated by firing lasers into nanophotonic cavities appeared first on Physics World.


]]>
https://hadamard.com/c/electrons-accelerated-by-firing-lasers-into-nanophotonic-cavities/feed/ 0 181
Proton therapy on an upward trajectory while FLASH treatment schemes get ready to shine https://hadamard.com/c/proton-therapy-on-an-upward-trajectory-while-flash-treatment-schemes-get-ready-to-shine/ https://hadamard.com/c/proton-therapy-on-an-upward-trajectory-while-flash-treatment-schemes-get-ready-to-shine/#respond Tue, 24 Oct 2023 15:30:10 +0000 https://physicsworld.com/?p=110776 Continue reading Proton therapy on an upward trajectory while FLASH treatment schemes get ready to shine]]> While proton therapy has well and truly arrived as a mainstream treatment option in radiation oncology – there are currently 42 operational proton facilities in the US and a further 13 centres under construction – it’s evident that the clinical innovation is only just getting started when it comes to at-scale deployment of protons for the treatment of cancer. That’s one of the key take-aways to emerge from a dedicated conference session – Innovative Radiation Therapy Approaches: Benefits, Challenges, Global Perspective – at the ASTRO Annual Meeting in San Diego, CA, earlier this month.

In terms of precision targeting, the case for proton therapy versus conventional radiotherapy is clear enough. Think similar tumour-killing properties as photons, but with markedly decreased dose to normal tissue. All of which helps the radiation oncology team treat tumours close to organs-at-risk (OARs), with the potential for decreased side-effects and complications along the way.

“Protons release all their energy at a point and then they stop,” explained James Metz, chair of radiation oncology at the University of Pennsylvania (UPenn) and executive director of the OncoLink cancer education service. That means no radiation dose beyond the target as well as far less dose deposited in front of the target compared with photon and electron irradiation.

James Metz

As such, clinicians are able to target the tumour layer-by-layer with pencil-beam-scanned proton delivery. “We take a tumour, divide it up voxel-by-voxel into 5 mm3 volumes and take this pencil beam and treat [complex structures] spot-by-spot with absolutely no exit dose,” Metz noted. “Protons give us the opportunity to reduce dose to normal structures, to combine with chemotherapy and immunotherapy, and to increase [radiation] doses going forwards.”

Notwithstanding the ongoing roll-out of proton therapy systems across the developed world – clinical uptake is similar for the US, Europe and Asia, although currently there’s only one proton treatment centre in sub-Saharan Africa – it’s apparent that “gold-standard” evidence for the clinical efficacy of protons is still a work-in-progress. “We need to systematically evaluate the clinical potential and define it through rigorous science – quantifying the benefits versus investment,” argued Metz. “After all, substantial resource and infrastructure are needed to support a proton therapy centre.”

The evidence is coming – and sooner than later. A number of randomized phase III clinical trials are accruing data or have recently closed for diverse cancer indications (including lung, oesophageal, liver, head-and-neck and brain). Meanwhile, pragmatic trials are also accruing well and evaluating proton treatments in routine clinical practice for patients with breast cancer and prostate cancer.

FLASH the disruptor

 

Metz, for his part, is one of the clinical pioneers of proton therapy, having led the development programme for the Roberts Proton Therapy Center in Philadelphia – a facility that has treated thousands of cancer patients using protons since it opened its doors in 2010. Clinical innovation being what it is, however, attention is already turning to what’s being touted as the “next big thing” in particle therapy: FLASH proton therapy.

 

For context, FLASH is an experimental treatment modality that involves ultrahigh-dose rate delivery (above 60–80 Gy/s) of ionizing radiation (electron, photon or proton) over very short durations (less than 1 s). Preclinical studies have shown that FLASH radiotherapy is less toxic to normal tissues and as effective as conventional radiotherapy at destroying tumours. If broadly validated, FLASH treatment schemes therefore have the potential to revolutionize radiotherapy – such that higher doses could be delivered safely to tumours or established doses be given with reduced toxicity to OARs.
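
To get a feel for what these dose rates mean, the short sketch below compares delivery times for a single fraction; the 10 Gy dose and the conventional dose rate of roughly 0.1 Gy/s are assumptions chosen for illustration, not figures from the talk.

```python
# Delivery time = dose / dose rate.
dose = 10.0                 # Gy, assumed fraction size
conventional_rate = 0.1     # Gy/s, assumed conventional linac dose rate
flash_rate = 80.0           # Gy/s, upper end of the FLASH range quoted above

print(f"Conventional delivery: {dose / conventional_rate:.1f} s")   # ~100 s
print(f"FLASH delivery:        {dose / flash_rate:.3f} s")          # ~0.125 s
```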

 

In short, FLASH proton therapy is shaping up as a future disruptor in radiation oncology, argued Metz, “bringing together biology and technology in new ways…and turning radiobiology on its head a bit”. The upsides are already coming into view.  For starters, FLASH proton therapy could significantly compress radiation treatment times, such that radiotherapy becomes more like a surgical procedure.

That’s good news for the patient along several coordinates – opening a path to improved quality-of-life, reduced toxicity and side-effects, as well as much less time spent in the clinic. At a more fundamental level, FLASH irradiation can also trigger different immune pathways and gene expression, creating novel opportunities for drug and radiation combinations.

Yet while FLASH has the potential to upend treatment paradigms and many current assumptions about radiation delivery, Metz concluded on a cautionary note: “I would say FLASH proton therapy is not yet ready for prime-time…[and] not ready to be deployed further than a few highly resourced centres that can complete the appropriate research and clinical trials.”

Clinical innovation: it’s all about outcomes

Alongside the clinical opportunities afforded by proton therapy, the ASTRO session on Innovative Radiation Therapy Approaches covered plenty of other bases. Tamer Refaat, professor of radiation oncology at Loyola University in Chicago, Illinois, kicked off with a status report on MR-guided radiotherapy (MRgRT).

 

“The big deal [with MRgRT] is real-time adaptation,” Refaat told delegates. In other words, personalized, daily-adapted radiotherapy that’s based on real-time and on-table patient anatomy, allowing the clinical team to maximize dose to the target volume and minimize dose to OARs.

As for MRgRT innovations to watch, Refaat highlighted the commercial and clinical roll-out of cine-gating functionality to enhance the treatment of upper abdominal tumours on a single phase of breathing. “The radiation beam turns on whenever the target is within the tracking boundary and turns off when outside,” he explained (adding that the downside is longer time on the treatment table for the patient).


Tamer Refaat: “The big deal [with MRgRT] is real-time adaptation.” (Courtesy: Loyola University)

Incorporation of functional MRgRT into the MR-Linac workflow also came under the spotlight, with Refaat citing researchers at MD Anderson Cancer Center (Houston, Texas) among the early-adopters seeking to identify radioresistant tumour subvolumes and escalate dose to those subvolumes accordingly.

Another hot topic centred on the combined-modality synergies of integrating immunotherapy and radiotherapy cancer treatments. The speaker, Silvia Formenti, a radiation oncologist at Weill Cornell Medicine in New York, is one of the main-movers behind a paradigm shift in radiobiology, her efforts elucidating the role of ionizing radiation on the immune system while demonstrating the efficacy of combined radiotherapy–immunotherapy regimes in solid tumours.

Formenti highlighted the pivotal role played in this regard by the ImmunoRad Radiation Oncology-Biology Integration Network (ROBIN). A multidisciplinary R&D collaboration between US and European cancer centres, ROBIN is seeking to better understand the interaction of radiation therapy and the immune response – as well as nurturing the talent pipeline of early-career scientists into the field. Right now, noted Formenti, the bigger picture is clouded by “financial toxicity”, with the cost of immunotherapy proving prohibitive for most low- and middle-income nations – as well as many Americans.

The focus on collaborative clinical research was echoed by Stephen Harrow, a consultant clinical oncologist at the Edinburgh Cancer Centre in Scotland. In the final talk of the session, he discussed the application of stereotactic body radiotherapy (SBRT) for oligometastatic disease.

Post-pandemic, Harrow highlighted how the Scottish Oligomet SABR Network (SOSN), aided by £1 million of Scottish government funding, has enabled Scotland’s five cancer centres to offer a joined-up SBRT treatment service to patients across the country (not just the highly populated central belt encompassing Glasgow and Edinburgh).

The goal of SOSN, he explained, is to “build a network of physicians, physicists and radiographers so that we’re all agreed on patient selection [criteria for SBRT] and we have equity for patients across the country”. What’s more, he added, “the evidence is definitely building that you can influence patient outcomes with SBRT for oligomet disease.”

 

The post Proton therapy on an upward trajectory while FLASH treatment schemes get ready to shine appeared first on Physics World.

 

]]>
https://hadamard.com/c/proton-therapy-on-an-upward-trajectory-while-flash-treatment-schemes-get-ready-to-shine/feed/ 0 182
Multi-eye-component imaging could help diagnose ocular disease https://hadamard.com/c/multi-eye-component-imaging-could-help-diagnose-ocular-disease/ https://hadamard.com/c/multi-eye-component-imaging-could-help-diagnose-ocular-disease/#respond Tue, 24 Oct 2023 13:41:03 +0000 https://physicsworld.com/?p=110743 Continue reading Multi-eye-component imaging could help diagnose ocular disease]]> a technique called reverberant optical coherence elastography (RevOCE) measures the elasticity or stiffness of eye structures with high resolution


A new technique evaluates the biomechanical properties of the eye with much better elastic resolution than current methods, raising hopes for more effective diagnostics and therapies. The new approach, put forward by researchers from the University of Houston in Texas in the US, goes by the name of multifocal acoustic radiation force reverberant optical coherence elastography, and might also improve our understanding of how different eye components function.


Eyes are extremely complex organs made up of several types of specialized tissue that help maintain intraocular pressure and thus allow us to see clearly. This normal biomechanical function is disrupted, however, in diseases such as keratoconus and glaucoma. Understanding how these changes happen is vital for diagnosing and treating ocular pathologies, but for that to be possible, clinicians need to be able to assess biomechanical properties such as stiffness. Current methods of doing this have important limitations. Magnetic resonance imaging (MRI), for example, is costly and requires patients to remain still for long periods. This includes not moving their eyes, since even small movements can lead to errors in measurements.


Creating 2D or 3D elasticity maps


In the new study, researchers led by biomedical engineer Kirill Larin employed a technique called reverberant optical coherence elastography (RevOCE) to measure the elasticity or stiffness of eye structures with high resolution. RevOCE involves using a low-power light source to scan a specific volume of the eyeball and then detecting the complex interference patterns produced by vibrating mechanical waves that subsequently propagate through the eye tissue. These patterns are then used to create 2D or 3D elasticity maps of the region studied.


The drawback is that the complex interference patterns of mechanical waves are difficult to generate, and the usual methods of doing so require the source of the waves – a mechanical shaker – to contact eye tissues directly. This can be very uncomfortable for the patient.


Larin and colleagues’ innovation was to generate reverberant shear waves with RevOCE in a much less invasive way, without compromising on resolution. Their new system comprises a multifocal acoustic radiation force (ARF) system with an ultrasound generator coupled to an array of acoustic lenses. These lenses focus the ultrasound waves to produce three distinct ARF beams spaced a few millimetres apart. These beams are then sent through a target area within the eye, where they induce vibrating shear waves that can be detected by an optical coherence tomography device and post-processed to reconstruct a 3D elastography map of the entire eyeball.


Comprehending overall eye function


The researchers tested their technique on ex vivo mouse eyeballs and confirmed that it could produce shear wave speed maps within different structures of the eye, including the cornea, iris, lens, sclera and retina. They found that the speeds of the waves were different in eye components such as the apical region of the cornea and the pupil of the iris. That is perhaps not surprising, but they also found that the wave speeds differed within different regions of the same component, such as the apex and periphery of the cornea. This implies that apparently uniform structures in the eye have non-uniform biomechanical properties – something that may be important for their healthy function.
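
The link between a measured shear-wave speed and stiffness comes from standard elastography relations: the shear modulus is G = ρc², and for nearly incompressible soft tissue the Young's modulus is roughly E ≈ 3G. The sketch below uses an assumed tissue density and hypothetical wave speeds, not values reported in this study.

```python
# Convert shear-wave speed to an approximate stiffness (standard elastography relations).
rho = 1000.0                     # assumed soft-tissue density, kg/m^3

for c_s in (2.0, 5.0, 10.0):     # hypothetical shear-wave speeds, m/s
    G = rho * c_s**2             # shear modulus, Pa
    E = 3 * G                    # approximate Young's modulus, Pa
    print(f"c_s = {c_s:4.1f} m/s  ->  G = {G/1e3:6.1f} kPa, E ~ {E/1e3:6.1f} kPa")
# Faster waves mean stiffer tissue, which is why speed maps double as stiffness maps.
```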


“Our technique provides valuable insights into how different components of the eye maintain their relative stiffness,” Larin says, “and the insights we have obtained will aid in developing a detailed understanding of how these components interact with each other.” Indeed, he adds that the information garnered during this work, which is detailed in the Journal of Biomedical Optics, could help researchers and clinicians analyse the biomechanical relationships within individual ocular components and among different components. This will be important for understanding overall eye function in healthy eyes, as well as for diagnosing and monitoring pathologies.


“Many eye diseases and conditions, such as glaucoma, affect multiple ocular components simultaneously,” he tells Physics World. “By assessing the biomechanical properties of the whole eyeball, as in our approach, it becomes possible to detect these diseases in their early stages, even before symptoms manifest in specific components. Early detection can lead to more effective treatments and preservation of vision.”


What is more, monitoring how the biomechanical properties of the entire eyeball change over time is essential for tracking disease progression. “Some diseases may affect one component first before impacting others,” Larin says. “By evaluating the entire eyeball, clinicians can better assess disease evolution and make timely adjustments to treatment plans.”


The University of Houston team now plans to further refine and validate its modified RevOCE technique. This will involve conducting in vivo studies and exploring applications in monitoring disease progression and treatment outcomes in different animal models, and, eventually, in humans. “We are also interested in expanding its use to other areas of the body in addition to the eye – for example, for non-invasively evaluating the biomechanical properties of deep tissues,” Larin reveals.


The post Multi-eye-component imaging could help diagnose ocular disease appeared first on Physics World.


]]>
https://hadamard.com/c/multi-eye-component-imaging-could-help-diagnose-ocular-disease/feed/ 0 183
LIGO surpasses the quantum limit https://hadamard.com/c/ligo-surpasses-the-quantum-limit/ https://hadamard.com/c/ligo-surpasses-the-quantum-limit/#respond Mon, 23 Oct 2023 15:00:00 +0000 https://news.mit.edu/2023/ligo-surpasses-quantum-limit-1023 Continue reading LIGO surpasses the quantum limit]]>

The following article is adapted from a press release issued by the Laser Interferometer Gravitational-wave Observatory (LIGO) Laboratory. LIGO is funded by the National Science Foundation and operated by Caltech and MIT, which conceived and built the project.

In 2015, the Laser Interferometer Gravitational-Wave Observatory, or LIGO, made history when it made the first direct detection of gravitational waves, or ripples in space and time, produced by a pair of colliding black holes. Since then, the U.S. National Science Foundation (NSF)-funded LIGO and its sister detector in Europe, Virgo, have detected gravitational waves from dozens of mergers between black holes as well as from collisions between a related class of stellar remnants called neutron stars. At the heart of LIGO’s success is its ability to measure the stretching and squeezing of the fabric of space-time on scales 10 thousand trillion times smaller than a human hair.

As incomprehensibly small as these measurements are, LIGO’s precision has continued to be limited by the laws of quantum physics. At very tiny, subatomic scales, empty space is filled with a faint crackling of quantum noise, which interferes with LIGO’s measurements and restricts how sensitive the observatory can be. Now, writing in the journal Physical Review X, LIGO researchers report a significant advance in a quantum technology called “squeezing” that allows them to skirt around this limit and measure undulations in space-time across the entire range of gravitational frequencies detected by LIGO.

This new “frequency-dependent squeezing” technology, in operation at LIGO since it turned back on in May of this year, means that the detectors can now probe a larger volume of the universe and are expected to detect about 60 percent more mergers than before. This greatly boosts LIGO’s ability to study the exotic events that shake space and time.

“We can’t control nature, but we can control our detectors,” says Lisa Barsotti, a senior research scientist at MIT who oversaw the development of the new LIGO technology, a project that originally involved research experiments at MIT led by Matt Evans, professor of physics, and Nergis Mavalvala, the Curtis and Kathleen Marble Professor of Astrophysics and the dean of the School of Science. The effort now includes dozens of scientists and engineers based at MIT, Caltech, and the twin LIGO observatories in Hanford, Washington, and Livingston, Louisiana.

“A project of this scale requires multiple people, from facilities to engineering and optics — basically the full extent of the LIGO Lab with important contributions from the LIGO Scientific Collaboration. It was a grand effort made even more challenging by the pandemic,” Barsotti says.

“Now that we have surpassed this quantum limit, we can do a lot more astronomy,” explains Lee McCuller, assistant professor of physics at Caltech and one of the leaders of the new study. “LIGO uses lasers and large mirrors to make its observations, but we are working at a level of sensitivity that means the device is affected by the quantum realm.”

The results also have ramifications for future quantum technologies such as quantum computers and other microelectronics as well as for fundamental physics experiments. “We can take what we have learned from LIGO and apply it to problems that require measuring subatomic-scale distances with incredible accuracy,” McCuller says.

“When NSF first invested in building the twin LIGO detectors in the late 1990s, we were enthusiastic about the potential to observe gravitational waves,” says NSF Director Sethuraman Panchanathan. “Not only did these detectors make possible groundbreaking discoveries, they also unleashed the design and development of novel technologies. This is truly exemplar of the DNA of NSF — curiosity-driven explorations coupled with use-inspired innovations. Through decades of continuing investments and expansion of international partnerships, LIGO is further poised to advance rich discoveries and technological progress.”

The laws of quantum physics dictate that particles, including photons, will randomly pop in and out of empty space, creating a background hiss of quantum noise that brings a level of uncertainty to LIGO’s laser-based measurements. Quantum squeezing, which has roots in the late 1970s, is a method for hushing quantum noise or, more specifically, for pushing the noise from one place to another with the goal of making more precise measurements.

The term squeezing refers to the fact that light can be manipulated like a balloon animal. To make a dog or giraffe, one might pinch one section of a long balloon into a small precisely located joint. But then the other side of the balloon will swell out to a larger, less precise size. Light can similarly be squeezed to be more precise in one trait, such as its frequency, but the result is that it becomes more uncertain in another trait, such as its power. This limitation is based on a fundamental law of quantum mechanics called the uncertainty principle, which states that you cannot know both the position and momentum of objects (or the frequency and power of light) at the same time.

Since 2019, LIGO’s twin detectors have been squeezing light in such a way as to improve their sensitivity to the upper frequency range of gravitational waves they detect. But, in the same way that squeezing one side of a balloon results in the expansion of the other side, squeezing light has a price. By making LIGO’s measurements more precise at the high frequencies, the measurements became less precise at the lower frequencies.

“At some point, if you do more squeezing, you aren’t going to gain much. We needed to prepare for what was to come next in our ability to detect gravitational waves,” Barsotti explains.

Now, LIGO’s new frequency-dependent optical cavities — long tubes about the length of three football fields — allow the team to squeeze light in different ways depending on the frequency of gravitational waves of interest, thereby reducing noise across the whole LIGO frequency range.

“Before, we had to choose where we wanted LIGO to be more precise,” says LIGO team member Rana Adhikari, a professor of physics at Caltech. “Now we can eat our cake and have it too. We’ve known for a while how to write down the equations to make this work, but it was not clear that we could actually make it work until now. It’s like science fiction.”

Uncertainty in the quantum realm

Each LIGO facility is made up of two 4-kilometer-long arms connected to form an “L” shape. Laser beams travel down each arm, hit giant suspended mirrors, and then travel back to where they started. As gravitational waves sweep by Earth, they cause LIGO’s arms to stretch and squeeze, pushing the laser beams out of sync. This causes the light in the two beams to interfere with each other in a specific way, revealing the presence of gravitational waves.

However, the quantum noise that lurks inside the vacuum tubes that encase LIGO’s laser beams can alter the timing of the photons in the beams by minutely small amounts. McCuller likens this uncertainty in the laser light to a can of BBs. “Imagine dumping out a can full of BBs. They all hit the ground and click and clack independently. The BBs are randomly hitting the ground, and that creates a noise. The light photons are like the BBs and hit LIGO’s mirrors at irregular times,” he said in a Caltech interview.

The squeezing technologies that have been in place since 2019 make “the photons arrive more regularly, as if the photons are holding hands rather than traveling independently,” McCuller said. The idea is to make the frequency, or timing, of the light more certain and the amplitude, or power, less certain as a way to tamp down the BB-like effects of the photons. This is accomplished with the help of specialized crystals that essentially turn one photon into a pair of two entangled, or connected, photons with lower energy. The crystals don’t directly squeeze light in LIGO’s laser beams; rather, they squeeze stray light in the vacuum of the LIGO tubes, and this light interacts with the laser beams to indirectly squeeze the laser light.

“The quantum nature of the light creates the problem, but quantum physics also gives us the solution,” Barsotti says.

An idea that began decades ago

The concept for squeezing itself dates back to the late 1970s, beginning with theoretical studies by the late Russian physicist Vladimir Braginsky; Kip Thorne, the Richard P. Feynman Professor of Theoretical Physics, Emeritus at Caltech; and Carlton Caves, professor emeritus at the University of New Mexico. The researchers had been thinking about the limits of quantum-based measurements and communications, and this work inspired one of the first experimental demonstrations of squeezing in 1986 by H. Jeff Kimble, the William L. Valentine Professor of Physics, Emeritus at Caltech. Kimble compared squeezed light to a cucumber; the certainty of the light measurements are pushed into only one direction, or feature, turning “quantum cabbages into quantum cucumbers,” he wrote in an article in Caltech’s Engineering & Science magazine in 1993.

In 2002, researchers began thinking about how to squeeze light in the LIGO detectors, and, in 2008, the first experimental demonstration of the technique was achieved at the 40-meter test facility at Caltech. In 2010, MIT researchers developed a preliminary design for a LIGO squeezer, which they tested at LIGO’s Hanford site. Parallel work done at the GEO600 detector in Germany also convinced researchers that squeezing would work. Nine years later, in 2019, after many trials and careful teamwork, LIGO began squeezing light for the first time.

“We went through a lot of troubleshooting,” says Sheila Dwyer, who has been working on the project since 2008, first as a graduate student at MIT and then as a scientist at the LIGO Hanford Observatory beginning in 2013. “Squeezing was first thought of in the late 1970s, but it took decades to get it right.”

Too much of a good thing

However, as noted earlier, there is a tradeoff that comes with squeezing. By moving the quantum noise out of the timing, or frequency, of the laser light, the researchers put the noise into the amplitude, or power, of the laser light. The more powerful laser beams then push LIGO’s heavy mirrors around causing a rumbling of unwanted noise corresponding to lower frequencies of gravitational waves. These rumbles mask the detectors’ ability to sense low-frequency gravitational waves.

“Even though we are using squeezing to put order into our system, reducing the chaos, it doesn’t mean we are winning everywhere,” says Dhruva Ganapathy, a graduate student at MIT and one of four co-lead authors of the new study. “We are still bound by the laws of physics.” The other three lead authors of the study are MIT graduate student Wenxuan Jia, LIGO Livingston postdoc Masayuki Nakano, and MIT postdoc Victoria Xu.

Unfortunately, this troublesome rumbling becomes even more of a problem when the LIGO team turns up the power on its lasers. “Both squeezing and the act of turning up the power improve our quantum-sensing precision to the point where we are impacted by quantum uncertainty,” McCuller says. “Both cause more pushing of photons, which leads to the rumbling of the mirrors. Laser power simply adds more photons, while squeezing makes them more clumpy and thus rumbly.”

A win-win

The solution is to squeeze light in one way for high frequencies of gravitational waves and another way for low frequencies. It’s like going back and forth between squeezing a balloon from the top and bottom and from the sides.

This is accomplished by LIGO’s new frequency-dependent squeezing cavity, which controls the relative phases of the light waves in such a way that the researchers can selectively move the quantum noise into different features of light (phase or amplitude) depending on the frequency range of gravitational waves.

“It is true that we are doing this really cool quantum thing, but the real reason for this is that it’s the simplest way to improve LIGO’s sensitivity,” Ganapathy says. “Otherwise, we would have to turn up the laser, which has its own problems, or we would have to greatly increase the sizes of the mirrors, which would be expensive.”

LIGO’s partner observatory, Virgo, will likely also use frequency-dependent squeezing technology within the current run, which will continue until roughly the end of 2024. Next-generation larger gravitational-wave detectors, such as the planned ground-based Cosmic Explorer, will also reap the benefits of squeezed light.

With its new frequency-dependent squeezing cavity, LIGO can now detect even more black hole and neutron star collisions. Ganapathy says he’s most excited about catching more neutron star smashups. “With more detections, we can watch the neutron stars rip each other apart and learn more about what’s inside.”

“We are finally taking advantage of our gravitational universe,” Barsotti says. “In the future, we can improve our sensitivity even more. I would like to see how far we can push it.”

The Physical Review X study is titled “Broadband quantum enhancement of the LIGO detectors with frequency-dependent squeezing.” Many additional researchers contributed to the development of the squeezing and frequency-dependent squeezing work, including Mike Zucker of MIT and GariLynn Billingsley of Caltech, the leads of the “Advanced LIGO Plus” upgrades that include the frequency-dependent squeezing cavity; Daniel Sigg of LIGO Hanford Observatory; Adam Mullavey of LIGO Livingston Laboratory; and David McClelland’s group from the Australian National University.

The LIGO–Virgo–KAGRA Collaboration operates a network of gravitational-wave detectors in the United States, Italy, and Japan. LIGO Laboratory is operated by Caltech and MIT, and is funded by the NSF with contributions to the Advanced LIGO detectors from Germany (Max Planck Society), the U.K. (Science and Technology Facilities Council), and Australia (Australian Research Council). Virgo is managed by the European Gravitational Observatory (EGO) and is funded by the Centre National de la Recherche Scientifique (CNRS) in France, the Istituto Nazionale di Fisica Nucleare (INFN) in Italy, and the National Institute for Subatomic Physics (Nikhef) in the Netherlands. KAGRA is hosted by the Institute for Cosmic Ray Research (ICRR) at the University of Tokyo and co-hosted by the National Astronomical Observatory of Japan (NAOJ) and the High Energy Accelerator Research Organization (KEK).
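To make the phase-versus-amplitude trade-off concrete, the toy model below compares a simplified quantum-noise budget with no squeezing, with frequency-independent squeezing and with ideal frequency-dependent squeezing. It is a rough sketch under stated assumptions (a flat shot-noise floor, radiation-pressure noise falling as 1/f², roughly 6 dB of squeezing); the numbers are illustrative and are not taken from the real LIGO noise model.

```python
import numpy as np

# Toy quantum-noise budget for an interferometer, in arbitrary strain units.
# Assumptions (illustrative only): shot noise is flat in frequency, radiation-
# pressure noise scales as 1/f^2, and squeezing trades one against the other
# by a factor exp(+/- r). Real detector noise curves are far more detailed.

f = np.logspace(1, 3, 200)          # frequency band, 10 Hz to 1 kHz
shot = np.ones_like(f)              # flat shot-noise floor (arbitrary units)
rad_pressure = (100.0 / f) ** 2     # radiation-pressure noise, large at low f

r = np.log(2.0)                     # squeeze factor of ~6 dB (amplitude factor 2)

def total_noise(shot_scale, rp_scale):
    """Quadrature sum of the two (assumed independent) noise terms."""
    return np.sqrt((shot * shot_scale) ** 2 + (rad_pressure * rp_scale) ** 2)

no_squeeze  = total_noise(1.0, 1.0)
fixed_angle = total_noise(np.exp(-r), np.exp(+r))   # quieter at high f, noisier at low f
freq_depend = total_noise(np.exp(-r), np.exp(-r))   # idealized: quieter everywhere

for label, curve in [("none", no_squeeze), ("fixed", fixed_angle), ("freq-dep", freq_depend)]:
    print(f"{label:8s}  10 Hz: {curve[0]:8.2f}   1 kHz: {curve[-1]:.2f}")
```

Running the sketch shows the pattern the article describes: fixed-angle squeezing lowers the high-frequency floor while making the low-frequency noise worse, whereas the frequency-dependent case reduces both ends of the band.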

]]>
https://hadamard.com/c/ligo-surpasses-the-quantum-limit/feed/ 0 247
Quantum-computing protocol avoids targeting individual atoms in an array https://hadamard.com/c/quantum-computing-protocol-avoids-targeting-individual-atoms-in-an-array/ https://hadamard.com/c/quantum-computing-protocol-avoids-targeting-individual-atoms-in-an-array/#respond Fri, 20 Oct 2023 13:46:58 +0000 https://physicsworld.com/?p=110721 Continue reading Quantum-computing protocol avoids targeting individual atoms in an array]]> Quantum bits (qubits) based on cold atoms are increasingly attractive candidates for quantum computing. However, targeting single atoms in an array with lasers to manipulate them individually for processing quantum information remains a challenge. Now,  Hannes Pichler at Austria’s University of Innsbruck and Francesco Cesa, who was visiting from Italy’s University of Trieste, have designed a new protocol for quantum computation that does not rely on targeting individual atoms. Other researchers are now trying to implement the protocol in the lab.

Quantum computers should be able to perform some calculations that are beyond the capability of even the most powerful conventional supercomputers. However, the technology is still in an early phase of development and it is not clear what type of qubits are best. Today, qubits based on superconducting circuits are the most advanced – but qubits based on arrays of cold ions have also found success.

More recently, there has been increasing interest in using arrays of ultracold neutral atoms as qubits.  Atoms are attractive because they are stable, scalable, identical in nature and controllable thanks to advances in laser technologies. Atoms can be excited to Rydberg states, allowing the atoms to interact and become entangled – which is a key process in quantum computing.

Quantum adjustments

In atomic arrays, lasers form regularly spaced optical tweezers that hold the atoms in place. Other lasers are used to adjust the quantum states of the atoms: exciting them, nudging them to release energy and return to their ground state, or leaving them in a superposition of energy states, with superposition being particularly useful for quantum computing.
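As a small aside on how a single pulse can either excite, de-excite or create a superposition, the snippet below rotates an idealized two-level atom by different pulse areas. It is a textbook two-level sketch (one ground state, one Rydberg level, resonant drive, no decay), not a model of the Innsbruck–Trieste protocol itself.

```python
import numpy as np

# Idealized resonant drive of a two-level atom: a pulse of area theta rotates
# the state as |psi> = cos(theta/2)|g> + sin(theta/2)|r> (global phase dropped).
# theta = pi fully excites the atom, theta = 2*pi returns it to the ground
# state, and theta = pi/2 leaves an equal superposition. No decay or detuning.

def apply_pulse(theta):
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])  # (amp_g, amp_r)

for name, theta in [("pi/2 pulse", np.pi / 2), ("pi pulse", np.pi), ("2*pi pulse", 2 * np.pi)]:
    amp_g, amp_r = apply_pulse(theta)
    print(f"{name:11s}  P(ground) = {amp_g**2:.2f}   P(Rydberg) = {amp_r**2:.2f}")
```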

The lasers that manipulate the states of the atoms typically illuminate the entire array, which makes it difficult to process quantum information held in individual atoms. However, in 2022 a team of researchers in the US and UK demonstrated the targeting of single atoms with laser beams. Also that year, a team that included Pichler took a different approach, moving single atoms out of the array so they could be targeted with a laser before being returned to the array.

“I am a big fan of that approach,” Pichler tells Physics World, but he adds that there could be benefits to an approach that does not require so much control of individual atoms.

Cesa agrees, “Indeed, current results on local addressing are very promising – and very exciting – but that remains one of the most delicate aspects of computation with Rydberg atoms”. He adds, “It is understood that one would prefer to use such a delicate tool as little as possible, and mostly rely on global controls”.

Strung along

In their new protocol, each qubit is a string of atoms called a wire. Each wire can exist in one of two quantum states or in a superposition of the two. Cesa explains, “at each step of the computation, the information is stored in a subset of the atoms” in each wire. This subset comprises  “interface atoms” that lie between two sections of wire made up of atoms that are ordered differently in terms of their excited and ground states. In a standard configuration, the atoms on one side of an interface alternate between the ground and excited Rydberg states and the atoms on the other side are all in the ground state.

Within a wire, an atom cannot be excited when it is within a certain distance from another excited atom – a distance called the “Rydberg blockade radius”. This means that an incident pulse will only excite the atoms on one side of the interface. Whether the first atom after the interface atom changes state depends on the state of the interface atom. In this way the interface and the information it encodes can move up the wire as the system is pulsed – or back down the wire if the pulses are inverted.
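The blockade rule described above can be illustrated with a small classical sketch: given a wire configuration and a blockade radius, it lists which ground-state atoms a global pulse could actually excite. This is a toy illustration only; the wire layout, the `blockade_radius` value and the function name are hypothetical, and the real protocol involves coherent quantum dynamics rather than this classical bookkeeping.

```python
# Toy illustration of the Rydberg blockade constraint on a single "wire".
# 1 = Rydberg (excited) atom, 0 = ground-state atom. An atom can only be
# excited by a global pulse if no already-excited atom sits within the
# blockade radius. (Classical sketch only; the actual protocol is quantum.)

def addressable_atoms(wire, blockade_radius=1):
    """Return indices of ground-state atoms that a global pulse could excite."""
    free = []
    for i, state in enumerate(wire):
        if state == 1:
            continue  # already excited
        neighbours = wire[max(0, i - blockade_radius): i + blockade_radius + 1]
        if 1 not in neighbours:
            free.append(i)
    return free

# Standard configuration: alternating excited/ground atoms to the left of the
# interface, all ground-state atoms to the right of it.
wire = [1, 0, 1, 0, 0, 0, 0, 0]

print(addressable_atoms(wire))   # -> [4, 5, 6, 7]
# Atoms 1 and 3 are blockaded by their excited neighbours, so the first atom a
# global pulse can flip sits just beyond the interface; flipping atoms there
# (and reversing the flips with inverted pulses) is what lets the interface,
# and the information it carries, move along the wire.
```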

So far, the information moving up and down the line is unchanged. Change occurs when the interface atom encounters “superatoms”. These are clusters of atoms at or in-between certain sites in an array of wires that can change the state of the qubit. This effectively processes quantum information held within an array.

“You can see it as either encoding the algorithm in the [configuration of the] superatoms or in the pulse sequence that moves around your information,” Pichler explains. He adds, “I think it’s beautiful that it connects natural dynamics of quantum many body systems to quantum information processing in a very transparent way”.

Complementary protocols

Pichler points out that their protocol could complement techniques that target individual atoms “as an additional knob in designing quantum processors”. Certain processes could use the targeted approach, while other subroutines may be achieved efficiently by globally addressing the entire array. “By employing our ideas, one can drastically reduce the calls to individual atom control, and judiciously decide when to use it,” adds Cesa.

Mark Saffman at the University of Wisconsin in Madison is an expert on targeting single atoms. He describes the new protocol as an “unexpected solution for achieving universal quantum computation with globally controlled arrays of Rydberg interacting atoms”.

He told Physics World that the requirement for controlling the position and quantum state of individual atoms “puts a heavy burden on the requirements of the optical control system. The global approach by Cesa and Pichler removes that requirement, which may make the path to scalability shorter.” However, he also points out that the “architecture does not yet incorporate error correction, which will undoubtedly be needed to reach quantum advantage for the most demanding applications”.

Pichler and Cesa agree, and they see error correction as the next key task. “This is a new way of quantum processing and it requires a new way of thinking about how to suppress errors,” says Pichler. He notes that since each qubit uses a string of atoms – not just one atom – the process might naively be considered more susceptible to errors. However, the effects of errors remain to be seen.

Cesa and Pichler have already identified features that can be exploited to help with error correction, pointing out that most of the atoms in each wire qubit do not have information associated with them. “You don’t need fully fledged quantum error correction to correct errors on this sort of idle atom,” Pichler explains.

Pichler and Cesa suggest that the protocol could also benefit other quantum-computing platforms such as those based on superconducting circuits.

The protocol is described in a paper that will appear in Physical Review Letters and in a preprint available on arXiv.

The post Quantum-computing protocol avoids targeting individual atoms in an array appeared first on Physics World.

 

]]>
https://hadamard.com/c/quantum-computing-protocol-avoids-targeting-individual-atoms-in-an-array/feed/ 0 189
Superconductivity ‘damaged’ as researchers look to move on from retractions https://hadamard.com/c/superconductivity-damaged-as-researchers-look-to-move-on-from-retractions/ https://hadamard.com/c/superconductivity-damaged-as-researchers-look-to-move-on-from-retractions/#respond Fri, 20 Oct 2023 11:30:12 +0000 https://physicsworld.com/?p=110713 Continue reading Superconductivity ‘damaged’ as researchers look to move on from retractions]]> Update 07/11/2023: The Lu-N-H paper (Nature 615 244) has since been retracted by the journal.

 

“I’m going to introduce a new material for the first time.” So said the condensed-matter physicist Ranga Dias to a packed conference room at the March meeting of the American Physical Society in Las Vegas earlier this year. The material in question was nitrogen-doped lutetium hydride, or Lu-N-H, and Dias went on to describe measurements claiming to have seen evidence for superconductivity at a remarkable 294 K (a balmy 21 °C) under a pressure of 1 GPa (10 kbar).

Based at the University of Rochester in the US, Dias claimed to have observed many signatures of superconductivity such as the electrical resistance dropping to zero at a particular transition temperature and the material expelling magnetic field lines. He and his colleagues also measured the sample’s specific heat, which showed a characteristic response at the transition temperature.

Their finding appeared to mark the culmination of a century-long quest in condensed-matter physics: the search for materials that superconduct under ambient conditions. Yet following the talk no-one spoke a word and there was no wild celebration. Dias simply finished his talk and passed the microphone over to the next speaker.

A member of the audience asked if there would be questions. “We don’t have time,” responded session chair Minta Akin from the Lawrence Livermore National Laboratory, her reply greeted with an audible groan from the room.

The atmosphere seemed very different from a previous APS March meeting in 1987 – the famous “Woodstock of physics” in New York City that took place just after the first high-temperature superconductors had been discovered.

Back then the physicists Georg Bednorz and Alex Müller had set the world of condensed-matter physics alight after discovering the year before that a material containing copper oxide, lanthanum and barium became superconducting at around 35 K. This was some 50% higher than the previous record of 23 K that had been achieved more than a decade earlier in niobium-germanium (Nb3Ge).

The new “cuprate” materials caused such a buzz because they were not metals but insulators and they offered the possibility of finding new stoichiometries and compounds that could potentially reach even higher transition temperatures.

A room-temperature superconductor was the holy grail, holding out the hope for a wide-range of applications from ultra-efficient energy grids to medical applications that require powerful magnets.

Bednorz and Müller later won the 1987 Nobel Prize for Physics for the discovery and in the decades that followed researchers created new cuprate-based compounds that reached transition temperatures of 133 K at ambient pressure and 166 K at a pressure of around 30 GPa.

 

From cuprates to hydrides

While the cuprates had been the de facto superconducting kings for the past couple of decades, that all began to change in the mid-2010s. In 2015 Mikhail Eremets and colleagues at the Max Planck Institute for Chemistry and the Johannes Gutenberg University Mainz, both in Germany, observed superconductivity at 203 K in a sample of hydrogen sulphide.

The material needed to be squeezed to 150 GPa, however (Nature 525 73). Then in 2018 a group led by Russell Hemley, then at George Washington University in the US, reported superconductivity at 260 K in lanthanum superhydride, albeit still under pressures of over 180 GPa; that work was published in 2019 (Phys. Rev. Lett. 122 027001).

That same year Eremets’ team reported superconductivity at temperatures up to 250 K  in lanthanum hydride at 170 GPa (Nature 569 528).

Work on these so-called binary hydrides – compounds that contain hydrogen and one other element such as hydrogen sulphide – sparked a “gold rush” in the search for high-temperature superconductors.

But what was most exciting was that they had been predicted entirely from first-principles calculations, with theory agreeing almost perfectly with experiment.


“The hydrides have probably been the single most exciting discovery in superconductivity after the cuprates, and an amazing success story of the interplay between theory and experiment,” says theoretical physicist Lilia Boeri from the University of Rome La Sapienza.

Dias and colleagues entered the high-temperature superconductivity game in 2020. Using his experience squeezing hydrogen to high pressure (see box below), Dias’s group published a paper on carbonaceous sulphur hydride that claimed to show superconductivity at 288 K under a pressure of about 260 GPa (Nature 586 373).

Around the same time Dias co-founded a company — Unearthly Materials — to commercialize room-temperature superconductors and that year the work was awarded a 2020 Physics World Breakthrough of the Year.

In 2021 Dias was even named as a TIME100 Next innovator for his work. “Let’s be clear: hoverboards, magnetic levitation trains and resistance-free power lines are not coming this year or next,” noted Time magazine. “But thanks to Ranga Dias, they’re closer than they ever were.”  

But not everything was as it seemed. In 2021 researchers raised concerns about some of the data processing in the paper, in particular the manner in which a background had been subtracted from the resistance measurements to show the sample’s resistance falling to zero below the transition temperature.

Then, in September 2022, the group’s Nature paper was retracted. “We have now established that some key data processing steps – namely, the background subtractions applied to the raw data used to generate the magnetic susceptibility plots – used a non-standard, user-defined procedure,” noted an editorial update written by the authors of the original paper.
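For context, a conventional background subtraction of the kind referees would expect typically fits a smooth curve to the normal-state data well away from the transition and subtracts it from the whole trace; the sketch below shows one common variant. It is a generic illustration under stated assumptions (simulated data, a quadratic background model), not the procedure used in either the published or the retracted analyses.

```python
import numpy as np

# Generic sketch of a conventional background subtraction for a susceptibility
# (or resistance) vs temperature trace. A smooth polynomial is fitted ONLY to
# the normal-state region well above the transition and then subtracted from
# the whole curve. All data below are simulated for illustration.

rng = np.random.default_rng(0)
T = np.linspace(150, 320, 400)                     # temperature in kelvin
Tc = 280.0                                         # assumed transition temperature
background = 2e-4 * (T - 150) + 1e-6 * (T - 150) ** 2
signal = -0.05 * (T < Tc)                          # idealized drop below Tc
raw = background + signal + rng.normal(0, 2e-3, T.size)

normal_state = T > Tc + 10                         # fit region: well above the transition
coeffs = np.polyfit(T[normal_state], raw[normal_state], deg=2)
subtracted = raw - np.polyval(coeffs, T)

print(f"mean above Tc: {subtracted[T > Tc + 10].mean():+.4f}")
print(f"mean below Tc: {subtracted[T < Tc - 10].mean():+.4f}")
# The key point: the background model is constrained only by data away from
# the transition, so the subtraction itself cannot manufacture a sharp drop.
```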

All nine authors on the paper disagreed with Nature’s decision to retract. The University of Rochester, meanwhile, carried out three internal inquiries: two were completed in May 2022 and a third was begun after the retraction. Rochester announced that the investigations had found no evidence of misconduct but has not released full details of the inquiries.

Dias was undeterred and, after giving his talk at the APS meeting this year, his team’s work on Lu-N-H was published, again in Nature (615 244).

In April, a patent listing Dias as the inventor was published (having been filed in April 2022) for a lutetium hydride material said to superconduct at room temperature. No details of the material’s exact stoichiometry were, however, given. And just as with the 2020 Nature paper, questions were raised about the background subtraction in the new study.

There were also concerns that the stated success rate of measuring superconductivity at high temperatures in Lu-N-H samples was only about 35%, when one would hope that all samples made to a certain recipe would be superconducting to aid reproducibility.


And when other researchers tried to reproduce the findings, they failed. Di Peng from the Institute of Solid State Physics in Hefei, China, and colleagues, for example, found some signs of a transition at about 240 K, but suggest they are not indicative of superconductivity (arXiv:2307.00201).

Theorists who tried to explain the high-temperature superconductivity found themselves struggling too. Boeri and colleagues recently showed that not only could they not identify a single compound in the Lu-N-H phase diagram that could explain Dias’ extraordinary claims, but also that Lu-N-H hydrides are intrinsically low-temperature superconductors (Nature Commun. 14 5367). “There is no single theoretical paper that finds a plausible explanation for Dias’ results,” she says.

Support for Dias’s work, however, came from Hemley, who is now at the University of Illinois Chicago. Having been given material prepared by Dias’ team, Hemley and colleagues measured the electrical resistance of the samples under various pressures, finding evidence for superconductivity as high as 276 K at 15 kbar (arXiv:2306.06301).

“Our measurements are in excellent agreement with what’s reported in the Nature paper,” Hemley told Physics World. “Moreover, the magnitude of the drop is even larger than that of the earlier data.”

Hemley says that theoretical analysis he and colleagues have carried out show that the electronic structure of Lu-N-H is ”remarkable” (arXiv: 2305.18196).

“With these continued discoveries, the pursuit of superconductors that function at or even above room temperature, together with the quest for stabilizing these materials near ambient pressure, remains very exciting,” he adds.

But there was further bad news in store for Dias. On 1 September 2023 Nature published an editor’s note alerting readers that Dias’ Lu-N-H paper is being investigated.

“The reliability of data presented in this manuscript is currently in question,” Nature said. “Appropriate editorial action will be taken once this matter is resolved.”

According to a report in the Wall Street Journal in late September, eight of the 11 authors of the Lu-N-H paper had written to Tobias Rödel, a senior editor at Nature, requesting that the paper be retracted, claiming that Dias “has not acted in good faith in regard to the preparation and submission of the manuscript”.

Apparently, Rödel replied to them within a few days noting: “We are in absolute agreement with your request that the paper be retracted.” So far, the only researchers to stick to their findings are Dias and two of his current PhD students.

David Ceperley from the University of Illinois, who penned a News & Views article for Nature about the Lu-N-H results, says he is “disappointed” that Nature did not do a better job of reviewing the paper in the first place.

“We were only provided with the accepted manuscript and not the data files or referee comments,” he says. “It was only after the paper came out that we learned of some of the problems that could have been found earlier.”

 

Allegations rack up for Ranga Dias

Originally from Sri Lanka, Ranga Dias graduated with a degree in physics from the University of Colombo in 2006. He then moved to the US, obtaining a PhD in 2013 from Washington State University studying materials under high pressure before doing a postdoc at Harvard University on metallic hydrogen with Isaac Silvera. Dias moved to the University of Rochester in 2017, where he began working on superconductivity in hydrides under high pressures.

Apart from the controversial hydride papers (see main text), there have also been accusations of plagiarism and misconduct in other areas of his work, with James Hamlin at the University of Florida concluding Dias plagiarized as much as a fifth of his PhD thesis (Science 380 227). A spokesperson for Dias has told Science that Dias is “addressing the issues directly with his thesis adviser”.

Then in August Physical Review Letters retracted a study from Dias that it had published in 2021 (127 016401) describing the electrical properties of manganese disulfide, which included a large reduction in electrical resistance under pressure. The retraction notice said that an internal investigation by four independent experts revealed “serious doubts about the origins of three of the low-temperature resistance curves”. The statement was signed by all authors except Dias, who said he “does not agree with the retraction”.

Moving on

What will happen to Dias’ group remains unclear. In August the University of Rochester announced that it is investigating Dias’ work again, although it is not known when that investigation will be complete. “Unfortunately, Dias’ inconsiderate behaviour has harmed the reputation of the field and it may take a few years to repair the damage,” says Boeri.

That view is backed by condensed-matter physicist James Hamlin from the University of Florida, who examined some of Dias’ group’s work. “I do think the whole saga is damaging to science in general, and superconductivity research more so and more broadly it’s fuel for anti-science types,” he told Physics World. “It could have an impact on funding for high pressure research and that would be unfortunate given that it’s been such a fruitful area with so many exciting recent developments.”

Hamlin also thinks that scientific research journals should broaden their communications to include all authors of the paper rather than just the corresponding author when potential research misconduct is raised. “All authors are subject to potential reputational harm from a misconduct allegation, so all authors should be privy to the relevant communications from editors from the very beginning,” he adds.

Despite these issues, work on the hydrides is progressing. In July Guangtao Liu of Jilin University, China, and colleagues found superconductivity up to 110 K at a pressure of 80 GPa in the ternary hydride LaBeH8 (Phys. Rev. Lett. 130 266001).

Although this temperature is not that high, these ternary compounds are exciting because they have a wider potential variety of structures than their binary cousins, which could expand the materials available for high-temperature superconductivity. “The field [of hydride research] is healthy and has the potential to yield many more ground-breaking results in future,” adds Boeri.

Ceperley agrees. “I still feel that hydride superconductivity has a good chance of eventually providing a superconductor at ambient conditions, which would have vast technological applications,” he notes. “The space of possible compounds and fabrication methods is so vast it may take some time to find them.”

As for Dias, he declined to comment for this article although in previous media comments he said he stood by his results.

In July Physics World even offered to publish an interview with Dias and sent a set of written questions to him via 30 Point, a US-based PR agency acting on Dias’s behalf. Despite having agreed to answer the questions, Dias later pulled out of the interview.

Physics World has since learned that 30 Point no longer works with Dias.

The post Superconductivity ‘damaged’ as researchers look to move on from retractions appeared first on Physics World.

 

]]>
https://hadamard.com/c/superconductivity-damaged-as-researchers-look-to-move-on-from-retractions/feed/ 0 190
Quantum algorithms make clever use of noisy hardware https://hadamard.com/c/quantum-algorithms-make-clever-use-of-noisy-hardware/ https://hadamard.com/c/quantum-algorithms-make-clever-use-of-noisy-hardware/#respond Thu, 19 Oct 2023 14:57:55 +0000 https://physicsworld.com/?p=110704 Continue reading Quantum algorithms make clever use of noisy hardware]]> While quantum computers show great promise for the future, today’s processors are small and noisy – and this makes it very difficult to do meaningful quantum calculations right now. To address this problem, researchers are developing clever quantum algorithms that make the most out of the hardware that is available today.

Some of those algorithms are being developed by UK-based Phasecraft and the firm’s co-founder and chief technology officer is our guest in this episode of the Physics World Weekly podcast. Toby Cubitt explains why the company is focussing on the development of quantum algorithms for calculating the properties of materials and how these algorithms can be run on today’s noisy hardware.

Cubitt also talks about career opportunities in quantum computing and explains why he believes that quantum computers could soon be solving scientifically relevant problems.

The post Quantum algorithms make clever use of noisy hardware appeared first on Physics World.

 

]]>
https://hadamard.com/c/quantum-algorithms-make-clever-use-of-noisy-hardware/feed/ 0 192