
What I wish I knew about privacy sooner

The hard truths no one warned me about.

Published 22 March 2025
– By Naomi Brockwell

I’ve been deep in the privacy world for years, but I wasn’t always this way. If I could go back, I’d grab my younger self by the shoulders and say: “Wake up. The internet is a battlefield of people fighting for your attention, and many of them definitely don’t have your best interests at heart”.

I used to think I was making my own decisions—choosing what platforms to try, what videos to watch, what to believe. I didn’t realize I was part of a system designed to shape my behavior. Some just wanted to sell me things I didn’t need—or even things that actively harmed me. But more importantly, some were paying to influence my thoughts, my votes, and even who I saw as the enemy.

There is a lot at stake when we lose the ability to make choices free from manipulation. When our digital exhaust—every click, every pause, every hesitation—is mined and fed into psychological experiments designed to drive behavior, our ability to think independently is undermined.

No one warned me about this. But it’s not too late—not for you. Here are the lessons I wish I had learned sooner—and the steps you can take now, before you wish you had.

1. Privacy mistakes compound over time—like a credit score, but worse

Your digital history doesn’t reset—once data is out there, it’s nearly impossible to erase.

The hard truth:

  • Companies connect everything—your new email, phone number, or payment method can be linked back to your old identity through data brokers, loyalty programs, and behavioral analysis.
  • Switching to a new device or platform doesn’t give you a blank slate—it just gives companies another data point to connect.

What to do:

  • Break the chain before it forms. Use burner emails, aliases, and virtual phone numbers.
  • Change multiple things at once. A new email won’t help if you keep the same phone number and credit card.
  • Be proactive, not reactive. Once a profile is built, you can’t undo it—so prevent unnecessary links before they happen.

2. You’re being tracked—even when you’re not using the internet

Most people assume tracking only happens when they’re browsing, posting, or shopping—but some of the most invasive tracking happens when you’re idle. Even when you think you’re being careful, your devices continue leaking data, and websites have ways to track you that go beyond cookies.

The hard truth:

  • Your phone constantly pings cell towers, creating a movement map of your location—even if you’re not using any apps.
  • Smart devices send data home at all hours, quietly updating manufacturers without your consent.
  • Websites fingerprint you the moment you visit, using unique device characteristics to track you, even if you clear cookies or use a VPN.
  • Your laptop and phone make hidden network requests, syncing background data you never approved.
  • Even privacy tools like incognito mode or VPNs don’t fully protect you. Websites use behavioral tracking to identify you based on how you type, scroll, or even the tilt of your phone.
  • Battery percentage, Bluetooth connections, and light sensor data can be used to re-identify you after switching networks.

What to do:

  • Use a privacy-focused browser like Mullvad Browser or Brave Browser.
  • Check how unique your device fingerprint is at coveryourtracks.eff.org.
  • Monitor hidden data leaks with a reverse firewall like Little Snitch (for Mac)—you’ll be shocked at how much data leaves your devices when you’re not using them.
  • Use a VPN like Mullvad to prevent network-level tracking, but don’t rely on it alone.
  • Break behavioral tracking patterns by changing your scrolling, typing, and browsing habits.
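The fingerprint-uniqueness score that coveryourtracks.eff.org reports is measured in bits of identifying information: an attribute value shared by roughly 1 in N browsers contributes log2(N) bits, and the bits add up across attributes. A minimal sketch of that arithmetic (the example counts are illustrative, not survey data):

```shell
# Bits of identifying information carried by one fingerprint attribute:
# a value shared by roughly 1 in N browsers contributes log2(N) bits.
bits() { LC_ALL=C awk -v n="$1" 'BEGIN { printf "%.1f\n", log(n)/log(2) }'; }

bits 2        # an attribute half of all browsers share (e.g. OS family): 1.0
bits 262144   # a near-unique value such as a canvas hash: 18.0
```

For scale, about 33 bits are enough to single out one person among eight billion, so a handful of moderately rare attributes is all a tracker needs.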

3. Your deleted data isn’t deleted—it’s just hidden from you

Deleting a file, message, or account doesn’t mean it’s gone.

The hard truth:

  • Most services just remove your access to data, not the data itself.
  • Even if you delete an email from Gmail, Google has already analyzed its contents and added what it learned to your profile.
  • Companies don’t just store data—they train AI models on it. Even if deletion were possible, what they’ve learned can’t be undone.

What to do:

  • Use services that don’t collect your data in the first place. Try ProtonMail instead of Gmail, or Brave instead of Google Search.
  • Assume that if a company has your data, it may never be deleted—so don’t hand it over in the first place.

4. The biggest privacy mistake: Thinking privacy isn’t important because “I have nothing to hide”

Privacy isn’t about hiding—it’s about control over your own data, your own life, and your own future.

The hard truth:

  • Data collectors don’t care who you are—they collect everything. If laws change, or you become notable, your past is already logged and available to be used against you.
  • “I have nothing to hide” becomes “I wish I had hidden that.” Your past purchases, social media comments, or medical data could one day be used against you.
  • Just because you don’t feel the urgency of privacy now doesn’t mean you shouldn’t be choosing privacy-focused products. Every choice you make funds a future—you’re either supporting companies that protect people or ones that normalize surveillance. Which future are you contributing to?
  • Anonymity only works if there’s a crowd. The more people use privacy tools, the safer we all become. Even if your own safety doesn’t feel like a concern right now, your choices help protect the most vulnerable members of society by strengthening the privacy ecosystem.

What to do:

  • Support privacy-friendly companies.
  • Normalize privacy tools in your circles. The more people use them, the less suspicious they seem.
  • Act now, not when it’s too late. Privacy matters before you need it.

5. You’re never just a customer—you’re a product

Free services don’t serve you—they serve the people who pay for your data.

The hard truth:

  • When I first signed up for Gmail, I thought I was getting a free email account. In reality, I was handing over my private conversations for them to scan, profile, and sell.
  • Even paid services can sell your data. Many “premium” apps still track and monetize your activity.
  • AI assistants and smart devices extract data from you. Be intentional about the data you give them, knowing they are mining your information.

What to do:

  • Ask: “Who profits from my data?”
  • Use privacy-respecting alternatives.
  • Think twice before using free AI assistants that explicitly collect your data, or speaking near smart devices.

Final thoughts: The future isn’t written yet

Knowing what I know now, I’d tell my younger self this: you are not powerless. The tools you use, the services you fund, and the choices you make shape the world we all live in.

Take your first step toward reclaiming your privacy today. Because every action counts, and the future isn’t written yet.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on YouTube.


Specialist doctor warns: Social media is hacking our brains


Published yesterday 12:08
– By Editorial Staff
In 15 years, eating disorders have almost doubled, a rise largely attributed to harmful trends on social media.

Users are not the customers of social media giants – they are the product itself, where the most important thing is to capture our attention for as long as possible and at any cost.

This is the conclusion of psychiatrist Anders Hansen, who points to TikTok as an example of a platform that “creates information that our brains cannot look away from”.

On the Swedish public television channel SVT, he explains what happens to users’ brains when they use social media and how harmful much of the content actually is – especially for young users.

– We humans want to belong to a group at any cost. It’s pure survival. We constantly ask ourselves: ‘Am I good enough for the group, am I attractive enough, or smart enough, or thin enough?’ When we are exposed to this two to three hours a day, we perceive that we are not good enough, that we are not worthy.

– Our brains register this as a threat to our survival, which is why it makes us feel so bad. Some people then try to do something about it, such as starving themselves… These are deeply biological mechanisms within us that are being hacked by this extremely advanced and sophisticated technology.

“The companies don’t care”

Although most platforms formally prohibit targeting weight-loss tips and similar thinness ideals at children and young people, such content remains very common.

Although eating disorders are complex illnesses with many potential causes, Hansen says it cannot be ignored that their incidence has almost doubled since 2010 across the Western world – and that this is likely due to the ideals promoted on social media.

– Companies don’t care if you develop a distorted self-image, they just want to squeeze every last second out of you. If you think about it, maybe you can awaken your inner rebel and not let companies take up your time, he explains.

Profit lost from restrictions

The psychiatrist also points out that all types of regulations and restrictions on algorithms and content mean that users will spend less time on social media – and that this is why social media companies systematically oppose such requirements.

– They have no interest whatsoever in trying to stop this.

Although TikTok is highlighted as the clearest example, there is now a long list of competitors that work in a similar way – including Instagram Reels (Meta), YouTube Shorts (Google), and Snapchat Spotlight.

Lock down your Mac

No Apple ID, no tracking, no nonsense.

Published yesterday 8:16
– By Naomi Brockwell

Apple markets itself as a privacy-focused company. And compared to Google or Microsoft, it is. But let’s be clear: Apple is still collecting a lot of your data.

If you want the most private computer setup, your best option is to switch to Linux. Not everyone is ready to take that step though, and many might prefer to keep their existing computer instead.

If you want to keep your current device but make it more private, what are your options?

Windows is basically a privacy disaster. Privacy expert Michael Bazzell says in his book Extreme Privacy:

“I do not believe any modern Microsoft Windows system is capable of providing a secure or private environment for our daily computing needs. Windows is extremely vulnerable to malicious software and their telemetry of user actions is worse than Apple’s. I do not own a Windows computer and I encourage you to avoid them for any sensitive tasks”.

If you want to keep your Mac without handing over your digital life to Apple, there are ways to lock it down and make it more private.

In this article, I’ll walk you through how to set up a Mac for better privacy—from purchasing the computer to tweaking your system settings, installing tools, and blocking unwanted data flows.

We’ll be following the setup laid out by Michael Bazzell in Extreme Privacy, with some added tips from my own experience.

We also made a video tutorial that you can follow along with.

You don’t need to do everything. Each chapter is modular. But if you follow the full guide, you’ll end up with a Mac that doesn’t require an Apple ID, doesn’t leak constant data, and gives you control over your digital environment.

Buying your Mac

Choose a model that still gets security updates

Apple eventually drops support for older devices. A privacy-hardened system isn’t useful if it doesn’t receive security updates.

Two helpful sites:

Pay with cash in a physical store

If you buy a Mac with a credit card, the serial number is forever linked to your identity.
Cash keeps you anonymous. You might get strange looks, but it’s completely within your rights. Be polite. Be firm. They’ll grumble. That’s fine.

Fresh install of macOS

If it’s a refurbished Mac—or even brand new—it’s worth doing a clean install.

Update macOS

  • System Settings > General > Software Update
  • Install updates, reboot, and reach the welcome screen.

Erase all content

  • System Settings > General > Transfer or Reset > Erase All Content and Settings
  • Enter your password, confirm warnings
  • Your Mac will restart and erase itself

This restores factory defaults: user data and settings are gone, but the OS remains installed.

Optional: Wipe the disk completely (advanced)

If you want a truly clean install, you’ll need to manually erase the entire internal disk. Only do this if you’re comfortable in recovery mode.

Modern Macs split the system into two parts—a sealed system volume and a data volume—tied together with something called firmlinks. If you don’t erase both correctly, you can end up with phantom volumes that clog your disk and break things silently.

Steps:

  • Enter Recovery Mode:
    • Apple Silicon: Hold power > click “Options”
    • Intel: Hold Command + R on boot
  • Open Disk Utility
  • Click View > Show All Devices
  • Select the top-level physical disk (e.g., “Apple SSD”)
  • Click Erase
    • Name: Macintosh HD
    • Format: APFS
    • Scheme: GUID Partition Map

Warning: Skip “Show All Devices” or erase the wrong item and you could brick your Mac. Only do this if you understand what you’re doing.

Once erased, return to the recovery menu and choose Reinstall macOS.

First boot setup

macOS wants to immediately link your device to iCloud and Apple services. Stay offline as long as possible.

Setup tips:

  • Region: Choose your location
  • Accessibility: Skip
  • Wi-Fi: Click “Other Network Options” > “My computer does not connect to the internet”
  • Data & Privacy: Continue
  • Migration Assistant: Skip (we’re starting fresh!)
  • Apple ID: Choose “Set up later”
  • Terms: Agree
  • Computer Name: Use a generic name like Laptop or Computer
  • Password: Strong and memorable. No hint. Write it down somewhere safe.
  • Location Services: Off
  • Time Zone: Set manually
  • Analytics: Off
  • Screen Time: Skip
  • Siri: Skip
  • Touch ID: Optional
  • Display Mode: Your choice

Harden system settings

Wi-Fi & Bluetooth

  • System Settings > Wi-Fi: Turn off
    • Disable “Ask to join networks” and “Ask to join hotspots”
  • System Settings > Bluetooth: Turn off

Firewall (built-in)

  • System Settings > Network > Firewall: Turn on
    • Disable “Automatically allow built-in software…”
    • Disable “Automatically allow downloaded signed software…”
    • Enable Stealth Mode
    • Remove any pre-approved entries
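If you prefer the terminal, the same four toggles map onto Apple’s socketfilterfw tool. A sketch, wrapped in a function so nothing runs until you invoke it with root privileges (macOS only):

```shell
# CLI equivalents of the GUI firewall steps above (macOS, run as root).
FW=/usr/libexec/ApplicationFirewall/socketfilterfw

harden_firewall() {
  "$FW" --setglobalstate on       # Firewall: Turn on
  "$FW" --setallowsigned off      # stop auto-allowing built-in software
  "$FW" --setallowsignedapp off   # stop auto-allowing downloaded signed software
  "$FW" --setstealthmode on       # Enable Stealth Mode
}

# After reviewing, run from a root shell: harden_firewall
```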

Notifications

  • System Settings > Notifications
    • Show Previews: Never
    • Turn off for Lock Screen, Sleep, and Mirroring
    • Manually disable for each app

Sound settings

  • System Settings > Sound
    • Alert Volume: Minimum
    • Disable sound effects and interface feedback

AirDrop & sharing

  • System Settings > General > AirDrop & Handoff: Turn everything off
  • System Settings > General > Sharing: Disable all toggles

Siri & Apple Intelligence

  • System Settings > Siri & Dictation: Disable all
  • Disable Apple Intelligence and per-app Siri access

Switch time server

Your Mac pings Apple to sync the time—leaking your IP every time it does.
Switch to a decentralized time server instead.

How:

  • System Settings > General > Date & Time
  • Click “Set…” > Enter password
  • Enter: pool.ntp.org
  • Click Done
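The same change can be made from the terminal with systemsetup, the CLI counterpart of the Date & Time pane. A sketch (macOS only, needs sudo):

```shell
# CLI equivalent of the Date & Time steps above (macOS, needs sudo).
set_ntp_pool() {
  sudo systemsetup -setnetworktimeserver pool.ntp.org
  sudo systemsetup -getnetworktimeserver   # verify the change took effect
}

# After reviewing: set_ntp_pool
```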

Spotlight & Gatekeeper

Spotlight

  • System Settings > Spotlight: Turn off “Help Apple improve search”

Gatekeeper

Gatekeeper prevents you from opening non-Apple-approved apps and sends app data to Apple.

If you’re a confident user, disable it:

  • Terminal: sudo spctl --master-disable (on recent macOS versions the same flag is spelled --global-disable)
  • System Settings > Privacy & Security: Allow apps from anywhere

FileVault & lockdown mode

FileVault

Encrypt your entire disk:

  • System Settings > Privacy & Security > FileVault: Turn on
  • Choose “Create a recovery key and do not use iCloud”
  • Write down your recovery key. Store it OFF your computer.
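fdesetup is the command-line counterpart of the FileVault pane; enabling it this way never touches iCloud, and the recovery key is printed in the terminal. A sketch (macOS only, needs sudo):

```shell
# CLI equivalent of the FileVault steps above (macOS, needs sudo).
# "fdesetup enable" prints the recovery key -- write it down, store it off this machine.
enable_filevault() {
  sudo fdesetup enable    # turn on full-disk encryption, no iCloud escrow
  fdesetup status         # confirm it reports FileVault is on
}

# After reviewing: enable_filevault
```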

Lockdown mode (Optional)

Restricts features like USB accessories, AirDrop, and others. Useful for high-risk users.

Customize appearance & finder

Desktop & dock

  • Disable “Show Suggested and Recent Apps”
  • Disable “Recent apps in Stage Manager”

Wallpaper

Use a solid color instead of version-specific defaults to reduce your system’s fingerprint.

Lock screen

  • Screensaver: Never
  • Require password: Immediately
  • Sleep timer: Your preference (e.g. 1 hour)

Finder preferences

  • Show all file extensions
  • Hide Recents and Tags
  • Set default folder to Documents
  • View hidden files: Shift + Command + .
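These Finder settings can also be scripted with the `defaults` command. The NewWindowTarget keys below are the ones Finder itself writes for the default-folder setting; treat the exact values as an assumption and verify in Finder’s settings afterwards (macOS only):

```shell
# Script the Finder preferences above with `defaults` (macOS only).
apply_finder_prefs() {
  defaults write NSGlobalDomain AppleShowAllExtensions -bool true  # show all file extensions
  defaults write com.apple.finder ShowRecentTags -bool false       # hide Tags in the sidebar
  defaults write com.apple.finder NewWindowTarget -string "PfDo"   # new windows open in...
  defaults write com.apple.finder NewWindowTargetPath -string "file://${HOME}/Documents/"
  killall Finder                                                   # restart Finder to apply
}

# After reviewing: apply_finder_prefs
```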

Block outbound connections

macOS and many apps connect to servers without asking. You’ll want to monitor and block them.

Use Little Snitch (or LuLu)

Browser

Install a privacy-respecting browser like Brave or Mullvad.

Compare options at privacytests.org

VPN

Use trusted providers like Mullvad or ProtonVPN.

Be careful which VPN you download — they’re often scamware and data collection tools.

Optional: Use Homebrew

Instead of the App Store, install software via Homebrew.
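The installer below is the official one published at brew.sh; as always, review a script before piping it to bash. Cask names are an assumption here, so search before installing:

```shell
# Homebrew's official installer (the command published at brew.sh).
install_homebrew() {
  /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
}

# Afterwards, install apps as casks instead of via the App Store, e.g.:
#   brew install --cask brave-browser
#   brew search mullvad    # cask names vary; search before installing
```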

We’ll cover this more in a future guide.

Final takeaways

If you followed this guide, you now have:

  • A Mac with no Apple ID
  • No iCloud tether
  • Full disk encryption (FileVault)
  • A silent firewall
  • Blocked outbound connections
  • A private browser and VPN setup

You’ve taken serious steps to reclaim your digital autonomy. Well done.

In an upcoming guide, we’ll explore how to take the next step: switching to Linux.

Thanks again to Michael Bazzell for his work.

Find his book Extreme Privacy at: inteltechniques.com/book7.html

 

Yours in privacy,
Naomi


Your therapist, your doctor, your insurance plan – now in Google’s ad system

Blue Shield exposed 4.7 million patients’ private health info to Google. Your most private information may now be fueling ads, pricing decisions, and phishing scams.

Published 10 May 2025
– By Naomi Brockwell

The healthcare sector is one of the biggest targets for cyberattacks—and it’s only getting worse.

Every breach spills sensitive information—names, medical histories, insurance details, even Social Security numbers. But this time, it wasn’t hackers breaking down the doors.

It was Blue Shield of California leaving the front gate wide open.

Between April 2021 and January 2024, Blue Shield exposed the health data of 4.7 million members by misconfiguring Google Analytics on its websites. That’s right—your protected health information was quietly piped to Google’s advertising systems.
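The mechanism is mundane: an analytics tag reports the full URL of every page it loads on, query string included. A hypothetical sketch (the domain and values are invented; GA4’s real collection endpoint is /g/collect, and dl is its “document location” parameter):

```shell
# Hypothetical illustration of the leak: an analytics tag sends the full
# page URL -- query string included -- with every page view it records.
PAGE="https://member.example-insurer.com/find-a-doctor?specialty=oncology&zip=90210"
HIT="https://www.google-analytics.com/g/collect?v=2&dl=${PAGE}"
echo "$HIT"
```

Everything after dl= — the specialty searched, the ZIP code — leaves the insurer’s site with the hit.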

Let’s break down what was shared:

  • Your insurance plan name and group number
  • Your ZIP code, gender, family size
  • Patient names, financial responsibility, and medical claim service dates
  • “Find a Doctor” searches—including provider names and types
  • Internal Blue Shield account identifiers

They didn’t just leak names. They leaked context. The kind of data that paints a detailed picture of your life.

And what’s worse—most people have become so numb to these data breaches that the most common response is “Why should I care?”

Let’s break it down.

1. Health data is deeply personal

This isn’t just a password or an email leak. This is your health. Your body. Your medical history. Maybe your therapist. Maybe a cancer screening. Maybe reproductive care.

This is the kind of stuff people don’t even tell their closest friends. Now imagine it flowing into a global ad system run by one of the biggest surveillance companies on earth.

Once shared, you don’t get to reel it back in. That vulnerability sticks.

2. Your family’s privacy is at risk—even if it was your data

Health information doesn’t exist in a vacuum. A diagnosis on your record might reveal a hereditary condition your children could carry. A test result might imply something about your partner. An STD might not just be your business.

This breach isn’t just about people directly listed on your health plan—it’s about your entire household being exposed by association. When sensitive medical data is shared without consent, it compromises more than your own privacy. It compromises your family’s.

3. Your insurance rates could be affected—without your knowledge

Health insurers already buy data from brokers to assess risk profiles. They don’t need your full medical chart to make decisions—they just need signals: a recent claim, a high-cost provider, a chronic condition inferred from your search history or purchases.

Leaks like this feed that ecosystem.

Even if the data is incomplete or inaccurate, it can still be used to justify higher premiums—or deny you coverage entirely. And good luck challenging that decision. The burden of proof rarely falls on the companies profiling you. It falls on you.

4. Leaked health data fuels exploitative advertising

When companies know which providers you’ve visited, which symptoms you searched, or what procedures you recently underwent, it gives advertisers a disturbingly precise psychological profile.

This kind of data isn’t used to help you—it’s used to sell to you.
You might start seeing ads for drugs, miracle cures, or dubious treatments. You may be targeted with fear-based campaigns designed to exploit your pain, anxiety, or uncertainty. And it can all feel eerily personal—because it is.

This is surveillance operating in a very predatory form. In recent years, the FTC has cracked down on companies like BetterHelp and GoodRx for leaking health data to Facebook and Google to power advertising algorithms.

This breach could be yet another entry in the growing pattern of companies exploiting your data to target you.

5. It’s a goldmine for hackers running spear phishing campaigns

Hackers don’t need much to trick you into clicking a malicious link. But when they know:

  • Your doctor’s name
  • The date you received care
  • How much you owed
  • Your exact insurance plan and member ID

…it becomes trivially easy to impersonate your provider or insurance company.

You get a message that looks official. It references a real event in your life. You click. You log in. You enter your bank info.
And your accounts are drained before you even realize what happened.

6. You can’t predict how this data will be used—and that’s the problem

We tend to underestimate the power of data until it’s too late. It feels abstract. It doesn’t hurt.

But data accumulates. It’s cross-referenced. Sold. Repackaged. Used in ways you’ll never be told—until you’re denied a loan, nudged during an election, or flagged as a potential problem.

The point isn’t to predict every worst-case scenario. It’s that you shouldn’t have to. You should have the right to withhold your data in the first place.

Takeaways

The threat isn’t always a hacker in a hoodie. Sometimes it’s a quiet decision in a California boardroom that compromises millions of people at once.

We don’t get to choose when our data becomes dangerous. That choice is often made for us—by corporations we didn’t elect, using systems we can’t inspect, in a market that treats our lives as inventory.

But here’s what we can do:

  • Choose tools that don’t monetize your data. Every privacy-respecting service you use sends a signal.
  • Push for legislation that treats data like what it is—power. Demand the right to say no.
  • Educate others. Most people still don’t realize how broken the system is. Be the reason someone starts paying attention.
  • Support organizations building a different future. Privacy won’t win by accident. It takes all of us.

Control over your data is control over your future—and while that control is slipping, we’re not powerless.

We can’t keep waiting for the next breach to “wake people up.” Let this be the one that shifts the tide.

Privacy isn’t about secrecy. It’s about consent. And you never consented to this.

So yes, you should care. Because when your health data is treated as a business asset instead of a human right, no one is safe—unless we fight back.

 

Yours in privacy,
Naomi


PoX: New memory chip from China sets speed record

Published 7 May 2025
– By Editorial Staff

A research team at Fudan University in China has developed the fastest semiconductor memory reported to date. The new memory, called PoX, is a type of non-volatile flash memory that can write a single bit in just 400 picoseconds – equivalent to about 25 billion operations per second.

The results were recently published in the scientific journal Nature. Unlike traditional RAM (such as SRAM and DRAM), which is fast but loses its contents in a power outage, non-volatile memory such as flash retains stored information without power. The problem has been that these memories are significantly slower – often thousands of times slower – which is a bottleneck for today’s AI systems that handle huge amounts of data in real time.

The research team, led by Professor Zhou Peng, achieved the breakthrough by replacing silicon channels with two-dimensional Dirac graphene – a material that allows extremely fast charge transfer. By fine-tuning the so-called “Gaussian length” of the channel, the researchers were able to create a phenomenon they call two-dimensional superinjection, which allows effectively unlimited charge transfer to the memory storage.

– Using AI-driven process optimization, we drove non-volatile memory to its theoretical limit. This paves the way for future high-speed flash memory, Zhou told the Chinese news agency Xinhua.

“Opens up new applications”

Co-author Liu Chunsen compares the difference to going from a USB flash drive that can do 1,000 writes per second to a chip that does a billion – in the same amount of time.

The technology combines low power consumption with extreme speed and could be particularly valuable for AI in battery-powered devices and systems with limited power supplies. If PoX can be mass-produced, it could reduce the need for separate caches, cut energy use and enable instant start-up of computers and mobiles.

Fudan engineers are now working on scaling up the technology and developing prototypes. No commercial partnerships have yet been announced.

– Our breakthrough can reshape storage technology, drive industrial upgrades and open new application scenarios, Zhou asserts.