Monday, 20 October 2025

Cybersecurity Risks in Water and Wastewater Operational Technology (OT)


The water and wastewater sector is increasingly under siege from cyberattacks targeting its operational technology (OT) systems that manage pumps, pressure controls, and chemical dosing. As these systems become more digitized and remotely accessible, their exposure to cyber threats grows rapidly.

Recent incidents highlight this escalating risk. In the United States, municipal water facilities in Tipton, Indiana, and in Texas suffered OT breaches that exposed vulnerabilities in remote SCADA access and forced operators to switch to manual control. The Municipal Water Authority of Aliquippa, Pennsylvania, was compromised in 2023 when Iranian-linked hackers infiltrated a Unitronics PLC that still used default passwords. The attack briefly disrupted pressure regulation before staff restored manual operations.

Beyond the United States, similar patterns are emerging. In Israel, an attempted OT attack in 2020 targeted chemical dosing systems, underlining the potential to endanger public health. Meanwhile, a 2025 study in Australia found that over 60% of utilities had experienced OT-targeted attacks, many traced to state-sponsored actors. While public disclosures remain limited in India and Southeast Asia, the widespread use of remote vendor connections, outdated PLCs, and weak authentication suggests latent vulnerabilities there as well.

To address these challenges, a strategic, defense-in-depth approach is essential. This includes segregating IT and OT networks, implementing multi-factor authentication, and deploying intrusion detection tailored to OT environments. Regularly auditing vendor access and enforcing strict password and patch policies further reduce risk.
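As a concrete illustration of what OT-tailored detection can look like, the minimal sketch below flags setpoint writes that originate from hosts outside an approved list or that fall outside a safe operating band. It is a hypothetical example only: the tag name, thresholds, and allow-list are assumptions made for illustration, not any real utility's configuration.

```python
# Minimal sketch of an OT-tailored detection heuristic (illustrative only).
# Field names, thresholds, and the allow-list are assumptions; a production
# system would consume live SCADA telemetry, not an in-memory sample list.

from dataclasses import dataclass

# Hosts permitted to issue setpoint changes (e.g., the local HMI and one
# audited vendor jump box) -- hypothetical addresses.
ALLOWED_SOURCES = {"10.10.1.5", "10.10.1.6"}

# Assumed safe operating band for a hypothetical chlorine dosing setpoint (mg/L).
DOSING_MIN, DOSING_MAX = 0.2, 4.0


@dataclass
class SetpointCommand:
    source_ip: str
    tag: str        # e.g. "chlorine_dosing_setpoint"
    value: float    # requested setpoint


def flag_command(cmd: SetpointCommand) -> list[str]:
    """Return human-readable alerts for a single setpoint command."""
    alerts = []
    if cmd.source_ip not in ALLOWED_SOURCES:
        alerts.append(f"unauthorised source {cmd.source_ip} wrote {cmd.tag}")
    if cmd.tag == "chlorine_dosing_setpoint" and not (DOSING_MIN <= cmd.value <= DOSING_MAX):
        alerts.append(f"setpoint {cmd.value} mg/L outside safe band [{DOSING_MIN}, {DOSING_MAX}]")
    return alerts


if __name__ == "__main__":
    sample = [
        SetpointCommand("10.10.1.5", "chlorine_dosing_setpoint", 1.2),   # normal write
        SetpointCommand("192.0.2.44", "chlorine_dosing_setpoint", 9.5),  # triggers both alerts
    ]
    for cmd in sample:
        for alert in flag_command(cmd):
            print("ALERT:", alert)
```

In practice such rules would complement, not replace, purpose-built OT monitoring, but even simple allow-lists and range checks catch the kind of out-of-band manipulation seen in recent incidents.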

Action Plan for Water and Wastewater Utilities

  1. Conduct regular cybersecurity assessments to identify vulnerabilities.

  2. Implement strong access controls, including replacing factory-default credentials, to prevent unauthorized entry (a minimal audit sketch follows this list).

  3. Train employees on cybersecurity best practices.

  4. Use encryption for data in transit and at rest.

  5. Apply firewalls and network segmentation to isolate OT systems.

  6. Maintain updated anti-virus and endpoint protection tools.

  7. Patch software carefully, balancing operational continuity.

  8. Develop robust backup and incident response plans.

  9. Enforce multi-factor authentication, especially for remote access.
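The following sketch illustrates the credential-hygiene side of item 2: it scans an exported device inventory for factory-default logins of the kind exploited in the Aliquippa incident. The CSV columns and the default-credential list are illustrative assumptions, not a specific vendor's documented defaults.

```python
# Minimal sketch of a credential-hygiene audit over a device inventory
# (illustrative only). Columns and default-credential pairs are assumptions;
# a real audit should use vendor-documented defaults and a maintained asset register.

import csv
import io

# Example factory-default credential pairs (illustrative, not exhaustive).
KNOWN_DEFAULTS = {("admin", "admin"), ("admin", "1111"), ("user", "user")}

# Stand-in for an exported asset register; in practice this would be a file.
INVENTORY_CSV = """\
device,ip,username,password
dosing_plc_1,10.10.2.11,admin,1111
intake_pump_hmi,10.10.2.12,ops_team,S7k!pQ92xLr
"""


def audit(inventory: str) -> list[str]:
    """Flag devices still using factory-default credentials."""
    findings = []
    for row in csv.DictReader(io.StringIO(inventory)):
        if (row["username"], row["password"]) in KNOWN_DEFAULTS:
            findings.append(f"{row['device']} ({row['ip']}) uses default credentials")
    return findings


if __name__ == "__main__":
    for finding in audit(INVENTORY_CSV):
        print("FINDING:", finding)
```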

Safeguarding water infrastructure is no longer optional; it is a matter of national resilience. Strengthening cyber hygiene and OT governance today ensures the uninterrupted delivery of one of humanity’s most essential resources.


Saturday, 18 October 2025

When AI Becomes Vice: A Three-Lens Callout of OpenAI’s Erotic Pivot



OpenAI’s announcement that ChatGPT will soon offer “erotic role-play” for verified adult users marks more than a product shift; it signals a drift in mission. As Parmy Olson observes in The Straits Times, this “pivot to porn” may indeed be problematic but lucrative.

What began as a promise to benefit humanity now risks being reduced to monetizing human desire. To understand how this misstep could reshape the AI landscape, let’s revisit it through three essential lenses: technology, governance, and society, inspired by The AI Dilemma by Juliette Powell and Art Kleiner.

1. Technological Lens — Build with Intention, Not Exploitation

AI is not neutral: it shapes behavior, moods, and expectations. When systems simulate emotional or erotic interaction, every design decision becomes a moral one. Will the system promote dependency? Will it exploit loneliness? Designers must prioritize human flourishing over “stickiness.”


2. Governance Lens — Demand Accountability, Not Excuses

The move underscores a deeper industry tension: monetization versus dignity. Corporations and regulators must step up:

- Investors and boards need enforceable ethics guardrails, not just revenue targets.

- Policy must clarify boundaries around emotional manipulation and adult content in AI systems.

- Users deserve transparency: clear consent, opt-out, and auditability of decisions.

Without accountability, profits easily displace purpose.

3. Societal Lens — Center Humans, Not Fantasies

The rise of erotic AI reflects deep human needs: connection, intimacy, validation. But as Olson notes, many chatbots are designed around male fantasies, reinforcing skewed stereotypes.

We should ask:

- Who benefits, and who is harmed by this shift?

- Are we normalizing emotional dependency on machines?

- Could that effort be invested in empathy, mental health, and inclusion instead?

True progress demands AI that uplifts, not exploits.

A Call to Recommit: Purpose Over Profit

OpenAI’s erotic pivot may bring short-term gains, but it sets a dangerous precedent: intelligence driven by impulse, not integrity.

If AI is to fulfill its promise to humanity, it must continually ask:

1) Does it respect our humanity?
2) Does it mentor, not manipulate?
3) Does it uplift, not exploit?

In the end, it’s not enough to build powerful systems. We must build systems with purpose.

AI’s Real Test: From Bubbles to Bridges



Two recent Financial Times and Straits Times articles, “AI’s Double Bubble Trouble” by John Thornhill [1] and “Chatbots Are a Waste of AI’s Real Potential” by Gary Marcus [2], capture a defining paradox of our time. One warns that AI’s market valuations are inflating faster than its real-world impact, while the other laments that AI’s brightest minds are building conversational tools instead of scientific breakthroughs. Both are right, but both miss a larger point.

AI today is not just a technology boom. It is a societal test. Thornhill’s concern about speculative excess is valid: when companies trade at 225 times earnings or promise trillion-dollar transformations before delivering measurable value, the risk of a financial bubble looms large. Yet beneath that excitement lies a quieter revolution where teachers use AI to simplify lesson plans, small businesses optimize energy use, and non-profits analyze data that was once locked behind technical walls. These are not speculative ventures. They are examples of AI democratizing data analytics at the ground level.

Marcus is equally correct that chatbots alone will not cure diseases or engineer new materials. But dismissing them as “a waste” misses their role as gateway technologies. Generative AI interfaces lower the barrier between human intent and machine reasoning. They allow billions of people to converse with data using natural language rather than code. This accessibility forms the foundation for more specialized, domain-specific AI to develop. Chatbots are not the pinnacle of AI. They are the bridge between human imagination and scientific application.

The real danger is not that society focuses too much on chatbots or that investors chase speculative valuations. The danger is that we create an AI divide. If advanced AI systems are accessible only to corporations and research labs while the public remains confined to surface-level tools, we risk reproducing inequality in digital form. The “AI haves” will innovate, while the “AI have-nots” will only consume.

What we need instead is a tiered vision of AI democratization:

1) Entry-level AI such as chatbots to empower everyday users

2) Intermediate AI for professionals in healthcare, education, and engineering

3) Advanced AI for scientific discovery and societal resilience

Each level should strengthen, not isolate, the others.

AI’s ultimate success will not be measured by stock prices or the number of startups it spawns. It will be judged by whether it lifts human capability across all levels of society. 

The task ahead is not to choose between speculation and science, or between chatbots and super-intelligence, but to ensure that AI remains a bridge, not a barrier, between progress and people.

References

[1] J. Thornhill, “AI’s double bubble trouble,” Financial Times, Oct. 17, 2025. https://www.ft.com/content/da16e2b1-4fc2-4868-8a37-17030b8c5498
[2] G. Marcus, “Chatbots are a waste of AI’s real potential,” The Straits Times. https://www.straitstimes.com/opinion/chatbots-are-a-waste-of-ais-real-potential

Friday, 17 October 2025

Sustainable End-of-Life Solar Panel Recycling: Turning Waste into Resources


As the world races toward a clean energy future, a new challenge has quietly emerged: what happens when solar panels reach the end of their life? With an average lifespan of 25 to 30 years, millions of panels installed in the early 2000s are now nearing retirement. If left unmanaged, these panels could end up in landfills, wasting valuable materials and harming the environment.


The Solar PV Panels Recycling Solution provides a transformative answer. Designed with sustainability at its core, the system processes up to 1,500 kilograms of modules, or about 75 solar panels, per hour, giving each component a second life.

Through an advanced sequence of aluminum frame removal, glass separation, and EVA sheet crushing, this technology recovers up to 95 percent of valuable materials such as glass, silicon, copper, and aluminum. Each recovered element reduces dependence on virgin mining, lowers manufacturing costs, and prevents pollution.
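To put those figures in perspective, the back-of-the-envelope sketch below works through the quoted throughput (1,500 kilograms, roughly 75 panels, per hour) and the 95 percent recovery rate. The per-material breakdown is an assumed, typical crystalline-silicon composition used only for illustration; it is not a figure published for this system.

```python
# Back-of-the-envelope recovery estimate using the throughput figures quoted
# above. The material breakdown is an illustrative assumption for a typical
# crystalline-silicon panel, not data from the recycling system itself.

THROUGHPUT_KG_PER_HOUR = 1500
PANELS_PER_HOUR = 75
RECOVERY_RATE = 0.95

# Assumed mass fractions (sum < 1; the remainder is polymer backsheet, etc.).
ASSUMED_COMPOSITION = {
    "glass": 0.70,
    "aluminium frame": 0.10,
    "EVA encapsulant": 0.07,
    "silicon": 0.04,
    "copper": 0.01,
}

mass_per_panel = THROUGHPUT_KG_PER_HOUR / PANELS_PER_HOUR    # about 20 kg per panel
recovered_per_hour = THROUGHPUT_KG_PER_HOUR * RECOVERY_RATE  # about 1,425 kg per hour

print(f"Average panel mass: {mass_per_panel:.0f} kg")
print(f"Total recovered per hour (at 95%): {recovered_per_hour:.0f} kg")
for material, fraction in ASSUMED_COMPOSITION.items():
    kg = THROUGHPUT_KG_PER_HOUR * fraction * RECOVERY_RATE
    print(f"  {material}: ~{kg:.0f} kg/hour")
```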

This process stands out because it represents the next evolution of renewable responsibility, closing the loop between clean energy generation and end-of-life care. It is best in class because it combines precision engineering, energy efficiency, and minimal environmental impact. Unlike traditional shredding or chemical methods, it uses a clean mechanical process that is faster, safer, and more sustainable.

This is not just recycling; it is a vision for a circular energy future where technology and nature work in harmony. It reminds us that true sustainability begins not at the start of a product’s life, but at its end.
