Before, I only used it as a monitor; a simple DP/HDMI input was all I wanted. But being able to take full control of the TV and connect it with other devices in the house (the kind of thing I'd normally get an RPi for) is pretty convenient!
Plus, I'm pretty confident they are already doing illegal things. My Samsung TV wants to force updates: there is no decline option and no option to turn off updates, only to take it completely offline. There's no way in hell these kinds of contracts would be legal in any other setting. There's no meaningful choice, and contracts that strongarm one party are almost always illegal. You can't sign a contract where the bank can arbitrarily change the loan on you (they can change the interest rate, but they can't arbitrarily change how that rate is determined, such as going from 1% to 1000% without some crazy, near-impossible economic situation).
Someone needs to start a class action. Someone needs to push that as far as the courts will go
The bigger problem here is tivoization. You can build a fresh kernel from source but you have no way to install it because the bootloader is locked down.
https://sfconservancy.org/blog/2021/mar/25/install-gplv2/ https://sfconservancy.org/blog/2021/jul/23/tivoization-and-t... https://events19.linuxfoundation.org/wp-content/uploads/2017...
In the email you have linked to, he does not support tivoization. He simply says that he finds the term offensive (which is really funny coming from him).
Torvalds has also publicly stated that he doesn't think that tivoization benefits users, but it's not his battle to fight. More info on that topic can be found in the linked YT (linked at the precise time he is answering the question about tivoization, but the whole video is about GPL v2 vs GPL v3).
YT video: https://youtu.be/PaKIZ7gJlRU?si=RK5ZHizoidgVA1xO&t=288
Imagine you make a smart toaster, and you make it entirely out of open source software. You release all the changes you made too, complying fully with the spirit of open source. People could take your software, buy some parts and make their own OSS toasters, everything’s great.
But for safety reasons, since the software controls when the toaster pops, you decide to check at boot time that the software hasn’t been modified. You could take the engineering effort to split the software into parts so that only the “pop on this heat level” part is locked down, but maybe you’re lazy, so you just check the signature of the whole thing.
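For illustration, the whole-image check described above can be sketched in a few lines. This is a simplified stand-in (real secure boot would verify a cryptographic signature against a public key in ROM, not just compare a hash), and all names and values here are made up:

```python
import hashlib

# Hypothetical "golden" digest burned into the boot ROM at the factory.
# In a real system this (or a signing public key) would live in
# one-time-programmable storage the user cannot change.
def compute_digest(firmware: bytes) -> str:
    return hashlib.sha256(firmware).hexdigest()

def boot_check(firmware: bytes, golden_digest: str) -> bool:
    # Refuse to boot if the whole image doesn't match the factory digest.
    return compute_digest(firmware) == golden_digest

factory_image = b"toaster firmware v1.0: pop at 180C"
golden = compute_digest(factory_image)

print(boot_check(factory_image, golden))                # True: unmodified image boots
print(boot_check(factory_image + b" patched", golden))  # False: any modification is rejected
```

Because the check covers the entire image, even a one-byte change to an unrelated feature bricks the boot, which is exactly the "lazy" trade-off described above.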
This would be a GPLv3 tivoization violation even though the whole thing is open source. You did everything right on the software end; it just so happens that the hardware you made doesn't support modifying the software. Why is that a violation of a software license?
This is what makes no sense to Linus, and TBH it makes no sense to me either. Would the toaster be a better product if you could change the software? Of course. But it seems to be an extreme overreach for the FSF to use their license (and that “or any later version” backdoor clause) to start pushing their views on the hardware world.
Nothing is stopping the "hardware world" from developing their own operating system. But as long as they choose to come as guests to the FSF/GPL party, partake of the snacks and fill their glasses at the free-refills fountain, they're expected to abide by the rules. The door's not locked; they can leave any time.
As arguments go, this is a pretty weak one considering how obvious the solution is: Make the manufacturer not be liable for what happens when you operate the device with unauthorized software.
I understand this discussion as being about how society should deal with it, not how you could try to make the argument internal to a company.
And, you know, it works great. It's simple to operate and (so far!) has been completely reliable.
I can hack on it in any way that I want to. There's no aspect of it that seeks to prevent that kind of activity at all.
What would I hack it to do instead? Who knows, but I can think of a couple of things. Maybe instead of having some modes where the elements are in series, I want them in parallel instead so the combination operates at higher power. Maybe I want to bypass the thermostat with an SSR and use my own control logic so I can ramp the temperature on my own accord and finally achieve the holy grail of a perfect slice of toast, and make that a repeatable task.
Whatever it is, it won't stand in my way of doing it -- regardless of how potentially safe or unsafe that hack may be.
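The "ramp the temperature on my own accord" hack could be sketched as a simple bang-bang control loop against a ramping setpoint. Everything below is illustrative: the thermal model, the rates, and the simulated loop all stand in for whatever real thermocouple and SSR wiring you'd actually use:

```python
def ramp_setpoint(t, start=25.0, rate=5.0, max_temp=200.0):
    """Target temperature (deg C) after t seconds: a linear ramp, capped at max_temp."""
    return min(start + rate * t, max_temp)

def simulate(seconds=10, dt=1.0):
    """Run a crude simulated toast cycle; returns (time, target, temp, heater_on) samples."""
    temp = 25.0  # start at ambient
    history = []
    for step in range(int(seconds / dt)):
        target = ramp_setpoint(step * dt)
        heater_on = temp < target  # bang-bang: drive the SSR whenever we're below target
        # Crude thermal model: elements add heat when on, ambient losses subtract a little.
        temp += (8.0 * dt if heater_on else 0.0) - 0.02 * (temp - 25.0) * dt
        history.append((step * dt, target, round(temp, 1), heater_on))
    return history

for t, target, temp, on in simulate():
    print(f"t={t:4.0f}s target={target:5.1f}C temp={temp:5.1f}C heater={'ON' if on else 'off'}")
```

Swap the simulated model for real sensor reads and GPIO writes and you have the skeleton of a repeatable toast profile, which is the kind of thing a locked-down appliance forecloses.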
There are countless examples of similar toaster ovens in the world that anyone else can hack on if they're motivated to do so, and very similar 3-knob Black & Decker toaster ovens are still sold in stores today.
And yet despite the profoundly-accessible hackability of these potentially-dangerous cooking devices (they didn't even bother to weld the cover on or use pentalobular screws, much less utilize one-way cryptographic coding!), they seem fine. They're accepted in the marketplace and by safety testing facilities like Underwriters Laboratories, who seem satisfied with where the bar for safety is placed.
Why would a toaster oven (or indeed, just a pop-up toaster) that instead used electronic controls need the bar for safety to be placed at a different height?
It wouldn't. It's a thought experiment. I even said:
> Would the toaster be a better product if you could change the software? Of course.
The point is, nobody should be compelling you to make your products hackable. If you don't want to, that's your prerogative.
The problem is, before GPLv3 existed, the authors that picked GPLv2 never expressed that they wanted their software to be part of some anti-locked-bootloader manifesto... they picked it because GPLv2 represents a pretty straightforward "you can have the source so long as you keep it open for any changes you make" license. That's what the GPL was. But this whole "Or any future version" clause gave FSF carte blanche to just alter the deal and suddenly make it so anyone can fork a project and make it GPLv3. I can perfectly understand why this would make people (including Linus) very mad.
And that's why Torvalds left out "or any future version" when licensing Linux. So I'm not sure why he's "very mad" (I doubt he actually is?); his software remains on GPLv2 like he wanted.
> The point is, nobody should be compelling you to make your products hackable.
If you want to use my GPLv3 software on your product, then yes, I am requiring that you make it hackable. If you don't want to do that, tough shit. Either do so, or freeload off someone else's software.
> The point is, nobody should be compelling you to make your products hackable. If you don't want to, that's your prerogative.
I agree.
Nobody is compelled to use GPLv3 code in the appliances that they want locked-down for whatever reasons (whether good or bad) they may wish to do that. There's other routes (including writing it themselves).
They may see a sea of beautiful GPLv3 code and wish they could use it in any way they desire, like a child may walk into a candy store and wish to have all of it for free, but the world isn't like that.
We're all free to wish for whatever we want, but that doesn't mean that we're going to get it.
> But this whole "Or any future version" clause gave FSF carte blanche to just alter the deal and suddenly make it so anyone can fork a project and make it GPLv3.
This "Or any future version" part isn't part of the GPL -- of any version.
Let us review GPL v1: https://www.gnu.org/licenses/old-licenses/gpl-1.0.en.html
> Each version is given a distinguishing version number. If the Program specifies a version number of the license which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the license, you may choose any version ever published by the Free Software Foundation.
The GPL itself does not in any way mandate licensing any code under future versions. An author can elect to allow it -- or not.
If they specify GPL 2, then they get GPL 2. Not 3. Not 4. Only 2.
Other versions of the GPL are ~the same in this way. (You know where to find them, right? They're easy reads.)
Products have worked this way since forever. Only since modern microprocessors and cryptography have evil companies been able to deliberately add roadblocks that are impossible to overcome (without replacing so much hardware that you've made a new toaster from scratch) in order to maximize revenue. This is predatory and should be illegal. The only reason I can see that you'd support this is if you work for a company that makes a lot of money selling new toasters to replace broken ones, and if that's true, your company deserves to be shut down by the government.
"For safety reasons" is every single claim. For safety reasons, I want to block the manufacturer's software from doing what it wants. Why do the manufacturer's safety reasons overrule my safety reasons?
> This would be a gpl3 tivoization violation even though the whole thing is open source.
Copyleft has nothing to do with open source. You haven't done everything right on the software end, because the GPL isn't about helping developers. To do things right on the software end, you should keep GPL software out of your locked down device that you are using to restrict the freedom of its users.
> Would the toaster be a better product if you could change the software? Of course.
You just said that it would be an unsafe product if you could change the software. Now you're using the "don't let the perfect be the enemy of the good" trope to pretend that you would of course support software freedom in an ideal, magical, childish, naïve dream world.
> it seems to be an extreme overreach for the FSF to use their license
People can license their software how they want. Is it an extreme overreach for Microsoft to not let you take their Windows code and do whatever you want with it? Why are you even thinking about GPL code when there's so much overreach coming from Adobe? They don't let you use their code under any circumstances!
All of your reasoning is motivated, and I would recommend that people not buy your toaster.
If you're afraid that modifying the software will make the toaster overheat, then include a hardware thermal fuse. You need to anyway, in case the manufacturer software fails or the processor fails.
A software author has the right to set terms for use of their software, including requiring that manufacturers provide end users certain freedoms.
Why would I want a Linux computer with a huge screen?
I just want a huge screen.
I’ll provide my own connected devices, independent of the screen.
But for many people, we just want a monitor, maybe with speakers (I personally am fine also separating this).
I prefer separation of concerns — if I want to attach a computer to my TV, I’ll do that as a separate device.
Why have a dependency on the TV hardware, when I can attach upgradable parts?
> you can make it a PC and then turn it off
TV manufacturers can get the best of both worlds: the people who want smart TVs get a smart TV, and the people who don't want one can disable the smart features. Manufacturers make one model and sell to both market segments.
Why should your preferences impose on those who don't want what you want? I guess the preferred way would be for manufacturers to add a feature where the TV prompts you on first boot to choose whether to enable smart features, but that's a hard sell when manufacturers make more money with these features enabled by default.
The problem is that I can’t have my preference: a TV that comes without (non-essential) software installed.
This means I have no choice but to deal with required updates — or at the very least, an annoying reminder that software updates are needed — for software I never wanted in the first place.
If the software was optional — could be uninstalled, or disabled so that updates weren’t required — then I would agree with you that having all TVs be smart TVs would be fine.
But not only is it not optional, it often comes with dark patterns of imposed privacy violations and/or unwanted ads.
The OP’s solution is to “jailbreak” it with a Linux install, which the average consumer doesn’t know how to do.
Again, it's fine for hackers who want to tinker with things, but the whole point of the linked article is that many people are tired of smart TVs and the annoyances that come with them.
If you want to buy a bare LCD panel, they're cheap. But you're going to have to add a processor to it that runs an OS (which you're free to write yourself, along with the driver) in order for it to understand any input. All that slapped together is what we call a monitor, or a television.
If you want an analog television, they'll pay you to haul it off from wherever you see it, but you're going to have to add an external computer to it in order to process the digital information that you want to display into waveforms that you can push over coaxial cables.
Not wanting a "smart tv" means people don't want a smartphone for a television, an OS that they don't have any control over. If you want to make up another definition, you're going to have to set limits to acceptable RAM, clock speed, number of processors, and I don't know why you would waste your time doing that. The number, however, will never be zero for any of these things.
They make fixed-function chips in factories every day that do stuff like convert video signals from one format to another (including formats that LCD panels can deal with).
Like the TFP401. For illustration, here is one on a board, ready to plug into an LCD panel and use for whatever: https://www.adafruit.com/product/2218
It doesn't run an OS. It's barely even programmable, and the programmability it does have relates only to configuring pre-defined hardware functions. It doesn't have an instruction set. It can't add 1+1.
But it can bridge the gap between a consumer device that produces video and a fairly bare LCD panel. It's a very much a single-tasker.
(Do any of the current crop of consumer-oriented televisions and computer monitors use this kind of simple pathway? Most assuredly not, which is the complaint that brought us here to begin with.
But these pathways exist anyway. It's completely possible to create an entire video display and house it in a nice-looking package, put it in a retail box, and sell it on store shelves without involving an operating system. It's not a technological limitation.)
It already needs a computer in it to drive menus and modern display protocols. Making that computer powerful enough to also decode content adds barely any cost.
It's all relative of course; maybe you view anything you can Ghidra as not-black-box. (Though this is kind of tangential to rooting -- for many/most devices you can get hold of the blobs to reverse engineer without rooting anything.)
Because I can then easily upgrade my computer without upgrading my TV.
This entire subthread is not computer-literate. Your monitor contains a computer. A dumb display contains a computer. Your keyboard contains a computer.
You can strip the software down on them so they do nothing but take commands and drive whatever electronics you have attached to them, but it will still be software on a computer. If there's a lot of RAM and a fat processor, like on a rooted smart TV, I might (but not necessarily) make it do a little more than that.
The same reason I don't want anything else in my life to be a computer. A computer is one more component that can fail and take down the whole product. I want my computer to be a computer and that's it.
For me, it seems so much simpler to keep the two separate. You won't be forced to wash the heating element every time you wash the cup. You can't heat a different cup while the other is in the dishwasher, unless all your cups are self-heating. Normally, the only way for a cup to break is if it shatters, but with an inbuilt heater there's electronics that can break too. And should the cup shatter, now the heater is unusable too, or vice versa.
I have to have a kettle for other purposes anyway (including heating water for mugs other than mine), and no self-heating mug is going to be as efficient as a kettle at heating water.
Furthermore, I also put cold or room temperature liquids in my mug. With a self-heating one, I would be carrying the heating parts for absolutely no reason.
Same goes for a TV. By keeping things separated, I can decide what I do with each device and manage their lifecycles separately. If the device reading video files is built into the TV, I can't plug it into another TV or a projector, or take it with me to use elsewhere. While I've upgraded my video-playing device three times to follow tech evolution, I've kept the same TV to plug them into.
It is fair to observe a separation methodology, but I also have to say, in some cases multi-purpose devices have their place.
If, say, the self-heating mug involved solar harvesting, I'd put a couple in my kettle bag, for sure.
You can make coffee with a kettle, but if you are making enough coffee often enough, it does make sense to bundle a second kettle into a dedicated coffeemaker, even if you are reducing the functionality of it by doing so.
But as a "power user" of a TV, I want to compose my own setup.
In the same way, "power users" of coffee don't use a coffeemaker. They use things like a French press.
(I use instant coffee myself in my non-heating mug so in this comparison I would be the person not owning a TV and watching everything on their phone?)
As a perpetual intermediate, I find that a pour-over cone is a great balance of convenience and quality.
This applies less for some physical items, I know some people are already preparing to explain why it’d be harder to make or dangerous or something but that would miss the point. Computers are incredibly easy to swap out, we already have so many ways of doing that.
Maybe I want a fast computer. Maybe I want none at all. Maybe I want to upgrade later. Maybe in a year there’s a faster, cheaper one. Maybe mine is just fine right now but I need a new screen. Why do I need to bundle the two things together? There’s a simplicity for users unboxing something, but there’s not (I think) an enormous blocker to making something interchangeable here.
This provides absolutely zero advantages to the oven or to the microwave. It does cause a lot of stupid, easily foreseeable problems:
- There's only one control panel, and if the oven is currently active, some of the microwave controls get disabled.
- The microwave is awful in various ways -- regardless of whether the oven is active -- which wouldn't ordinarily be a problem, because microwaves are very cheap. But...
- It's impossible to replace the microwave, a $50 device, without simultaneously replacing the oven, a $2000 device.
Example: watching a movie but wanting the live score of a sports match, scraped from a public website, displayed in a corner.
Or, while watching a sports match, I want an overlay feed of text from a chat stream from a selected web source.
Looking forward to some public experiments / open projects in this space I could leverage. I don't have the skills to attempt it myself from scratch.
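A rough sketch of the score-overlay idea: poll a JSON feed and render a one-line overlay string that a compositor on the rooted TV could draw in a corner. The feed URL and JSON fields here are entirely made up for illustration:

```python
import json
from urllib.request import urlopen

# Placeholder URL: any public JSON score feed would slot in here.
SCORE_URL = "https://example.com/api/live-score"

def fetch_score(url=SCORE_URL):
    """Fetch and decode the live-score payload (not called in this demo)."""
    with urlopen(url, timeout=5) as resp:
        return json.load(resp)

def format_overlay(payload):
    """Render one corner-overlay line from a score payload."""
    return (f"{payload['home']} {payload['home_score']} - "
            f"{payload['away_score']} {payload['away']} ({payload['clock']})")

# Demo with a canned payload instead of a live fetch:
sample = {"home": "OKC", "home_score": 101,
          "away": "DEN", "away_score": 98, "clock": "Q4 2:31"}
print(format_overlay(sample))  # OKC 101 - 98 DEN (Q4 2:31)
```

On a rooted TV, a small loop calling `fetch_score` every few seconds and feeding `format_overlay` into an on-screen-display layer would be the whole project; the hard part is the per-TV compositing, not the scraping.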
I mean, I didn't want to buy a brand new one anyway; it's very expensive and I don't need the latest AI features. I found a year-old model with firmware that was listed as supported by the jailbreak at the time.
Global Panel Exchange Center
That's like Alibaba, except for small(ish) quantities of LCDs of any possible description.
What you buy is what companies put out into the world.
This “proprietary telemetry” is basically malware, just, it was put on the thing at the factory. Once a system is fully rooted by malware, the least-bad option is to nuke it entirely and install from scratch.
In this context where the locked-down device probably also doesn’t have a fully open source kernel and drivers, this becomes a bit tricky. Better just to use a device that doesn’t have malware on it in the first place.
Alternatively, you can plug in a Raspberry Pi that runs steam link :)
It's the same system software, just with root capacity.
That being said, there's still a bunch of nice homebrew:
- Video screensavers ala Apple TV
- DVD logo screensaver
- Adfree (and sponsorblock-integrated and optional shorts-disabling) Youtube
- Remote button remapping (Netflix button now opens Plex for me)
- Hyperion (ambilight service that controls an LED strip behind the TV)
- A nice nvidia shield emulator for game streaming from my PC with low latency
- VNC server (rarely useful, but invaluable when it is)
Sponsorblock and remote remapping are killer features for me, and the rest is just really pleasant to have.
I tried a 48" TFT-type television (attempting use as a computer display) and the refresh rate just wasn't there, along with typical backlight splotching (but it cost a fifth as much, so...).
My only caution is OLED can experience burn-in (unlike the smaller Aorus 45" using a VA-type panel), but it is otherwise a much better experience
I would rather have a quality large display with speakers and DP than a TV. The only argument in favour of buying a large TV for coding is cost.
The other limitation is lower brightness than miniLED monitors, around 30-60% of the nits in SDR. Whether that matters obviously depends on the ambient light or reflective surfaces near you.
For me, because I'm next to a big window and already squinting at my 400 nits IPS monitor, a < 300 nits OLED is a non-starter, but a 600 nits in SDR, IPS miniLED, is ideal.
This limitation should be temporary however because there are some high nit OLED TVs coming on the market in 2025 so bright OLED 27-43" monitors will likely follow.
It's also easier to implement, if I recall correctly.
This is the essential core of it, as I have come to understand it anyway.
Wikipedia [0] states:
> VESA, the creators of the DisplayPort standard, state that the standard is royalty-free to implement.
And VESA's website [1] lists Samsung, Sony and LG as being members already, so they've already paid. What am I missing here?
Edit: It is cool I can plug my phone or laptop into the TV with one cable, no adapters, and get some power as well. For some reason it didn't work with my Steam Deck which was strange.
I can't be the only one that hooks up my computer, with a graphics card, to my TV
DisplayPort is a standard. A DisplayPort port is a port that follows the DisplayPort standard.
Issues with HDCP support maybe?
[0]: This came up recently with Valve: https://news.ycombinator.com/item?id=46220488
Granted, I suspect quite strongly the next wave of consolidation is going to continue the trend of being around USB-C, since the spec should have the bandwidth to handle any video / audio protocols for quite some time. Matter of time until that happens IMO.
It also lets you have a single cord that could theoretically be your power cord and your A/V cord.
DisplayPort already allows multiple video streams, audio streams... Why do we need a closed standard to also do this?!
AMD doesn't (can't? won't?) do the same but there is a workaround: a DisplayPort to HDMI adapter using a particular chip running hacked firmware. That'll get you 4K 120 Hz with working FreeSync VRR.
https://forum.level1techs.com/t/it-is-possible-to-4k-120-hdr...
Westinghouse TVs are made by a company licensing the brand, not a "Pittsburgh-headquartered company".
These seem like easy mistakes to avoid.
That just has to be an LLM at work.
The whole article is pretty terrible.
Also, personal experience: My own ISP (in Germany) experimented with some similar stuff a few years ago: They mandated use of their own home routers where only they had root access. At some point, they pushed an OTA update that made the router announce a second Wifi network in addition to the customer's. This was meant as a public hotspot that people walking down the street could connect to after installing an app from the ISP and buying a ticket.
The customer that "owned" the router wasn't charged for that traffic and the hotspot was isolated from the LAN (or at least the ISP promised that), but it still felt intrusive to just repurpose a device sitting in my living room as "public" infrastructure.
(The ISP initially wanted to do this on an "opt-out" basis, which caused a public uproar thankfully. I think eventually they switched to opt-in and then scrapped the idea entirely.)
Not sure if you're referring to Vodafone, but Vodafone Germany definitely does this. You can opt out of allowing public access via your personal router, but this opts you out of being able to use other people's routers in the same manner.
It was fairly well implemented I think: separated from your network, bandwidth was limited (to avoid impacting the host), you could opt-out (which meant opting out of using the guest network), joining the wifi was automatic if you had a cellphone with the same ISP and it was the same "guest" network for all routers so in big cities, you could rely only on this to access Internet.
It was stopped a few years ago when they deemed cellular network was reliable enough to not need the guest network.
(That was years before the other incident - since then they had dropped that idea and "generously" given customers access to the config UI)
You might be interested to read about the findings by Ruter, the publicly owned transport company for Oslo. They discovered that their Chinese Yutong electric buses contained SIM cards, likely there to allow the buses to receive OTA updates, but which consequently means the buses could be remotely modified at any moment. Thankfully they use physical SIMs, so some security hardening is possible.
Of course, with eSIMs becoming more widespread, it’s not inconceivable you could have a SoC containing a 5G modem with no real way to disable or remove it without destroying the device itself.
[1] https://ruter.no/en/ruter-with-extensive-security-testing-of...
This Kindle did not have things like idle-screen advertising. That wasn't a thing yet.
These first edition devices were available with unlimited data access (IIRC in the US via AT&T) on cellular networks without a separate subscription. It was slow (everything was slow back then), but it would let a person download a book or have a look at a web page (with the very limited browsing that was possible with e-ink and a CPU that was meant more to barely sip power than to render megabytes of CSS and JS).
The expense of the data access was built into the one-time purchase price, and the hope was that people having the ability to buy books from "anywhere" would snowball into a thing that was both very popular and profitable.
It was simple and, functionally at least, it worked very neatly: Take new Kindle out of the box, switch it on, and download a book with it. No wifi or PC connection or other tomfoolery needed.
That was back in 2007 -- a time when many people still had landlines at home if they wanted to make a phone call, or a dumb phone in their pocket if they wanted to do that on-the-go. Some folks had Blackberries or connected Palm devices, but those things were rare.
And the Internet, and indeed Amazon itself, was a very different place back then. Having an Internet connection that was very quietly always available on a Whispernet-equipped Kindle was pretty cool at that time.
---
Sidewalk is a different kind of network. It uses consumer devices (like Echo Dot speakers) to act as Sidewalk bridges. This generally works at a low frequency (900MHz-ish), to provide a bit of relatively slow, relatively long-range wireless network access for things that are otherwise lacking it.
The present-day operation works like this: Suppose I've got some Amazon Echo speakers scattered around my house. If a neighbor's Internet connection is on the fritz, then their Ring doorbell can use a tiny slice of my Internet bandwidth using Sidewalk via one of my Echo speakers to keep itself connected to the network and thereby still function as a doorbell.
Or, maybe their Ring doorbell is out on a post by the gate, where their wifi coverage sucks. If it can gather up a little slice of 900MHz Internet access from anyone's near-enough Sidewalk bridge, then they've still got a button for their gate that notifies them on their pocket supercomputer when some visitor is waiting out there. They don't even necessarily need to plan it this way in order for it to Just Work.
Or, what GP was referring to: Your hypothetical new smart TV might use the neighbors' Sidewalk-enabled device(s) to update or patch itself, produce new ads to show you, and/or send telemetry back home to Mother. It might do this even without you ever having deliberately connected it to any network at all.
---
Either thing (some modern equivalent to Whispernet, or the already-loose-in-the-wild Sidewalk system) could potentially be utilized by smart TVs and other devices to get access to the network and simply sidestep the oft-repeated, well-intended, and somewhat naive mantra of "It can't have Internet access if you never connect it!"
https://electronics.sony.com/tv-video/televisions/television...
A family member's TV came with it.
Every Android-based media player I've tried just plain sucks; the NVIDIA Shield wasn't too bad, but at some point the controller quit charging. You can still get a game console with a built-in Blu-ray player too, and it's nice to have one box that does that as well as being overpowered for streaming.
I have a HDHomeRun hooked up to a small antenna pointed at Syracuse which does pretty well except for ABC, sometimes I think about going up on the roof and pointing the small one at Binghamton and pointing a large one at Syracuse but I am not watching as much OTA as I used to. It's nice though being able to watch OTA TV on either TV, any computer, tablets, phones, as well as the Plex Pass paying for the metadata for a really good DVR side-by-side with all my other media.
As for TVs I go to the local reuse center and get what catches my eye, my "monitor" I am using right now is a curved Samsung 55 inch, I just brought home a plasma that was $45 because I always wanted a plasma. I went through a long phase where people just kept dropping off cheap TVs at my home, some of which I really appreciated (a Vizio that was beautifully value engineered) and some of which sucked. [1]
[1] ... like back in the 1980s everybody was afraid someone would break into your home and take your TV but for me it is the other way around
I also use this for occasional gaming, or I would've stuck with the PS3 Slim or PS4 Slim. Both of which would mount pretty nicely, with a VESA bracket, to the back of a pre-smart formerly top-of-the-line 1080p Sony Bravia TV (like I use currently with the PS5 Slim).
Were I not in minimalism culling mode of personal belongings right now (in case the current job search moves me cross-country), I'd be stockpiling a backup or two of this workhorse dumb-TV.
Top of the line 2025 65" Sony Bravia 8 OLED: $2,300
Plasma TVs not mounted to the wall are also concerning for safety reasons. A plasma TV tipped onto a young child would likely cause significant injury above and beyond what an LCD would cause.
I loved my plasma but I feel like the weight was a real issue.
> According to its privacy policy, the company gathers usage data, such as “data about your activity on and use of” Apple offerings, including “app launches within our services…; browsing history; search history; [and] product interaction.” [...] transaction information, account information (“including email address, devices registered, account status, and age”), device information (including serial number and browser type), contact information (including physical address and phone number), and payment information (including bank details).
Yeah, sure, that's privacy, Ars.
[1]https://arstechnica.com/gadgets/2025/06/all-the-ways-apple-t...
1. Email address - you have to use an email address to have an Apple account. How are they not going to have your email?
2. Devices registered - you mean when you log into your device, they keep track of your logged in devices!
3. Transaction history - they keep track of what you bought from them!
Must I continue? Every single piece of data that you named is required to do business with them.
Also, 'product interaction' is a euphemism for "if you're sick, we'll sell this information for around 80€" (I think it's close to $200 for Americans, but I don't have any contacts in this industry overseas). If you have cancer and suddenly see an increase in ads for pseudo-medicine and other scams whose only goal is to extract whatever money you have left (and, if they're lucky, your family's money too), that's from 'product interaction'.
You can give Apple any age you want to. It’s not like it checks.
And I have no idea about the other topics you are going off on, or what they have to do with Apple.
Chrome and Firefox do the same.
But what privacy threat are you trying to avoid by not having your bookmarks and history synced in an E2E manner where Apple can’t see either?
Many tens to hundreds of dollars for that single datapoint is incredible. I have naively assumed we were just packaged up in aggregate and never thought more deeply than that.
What are the most valuable data? Pregnant? Wedding? Divorce? Illness? Home purchase?
They want to show you things you have recently watched or looked at when you log in, rather than just random TV shows.
> Age?
You can give your kids an age-restricted account so what they watch is limited.
(/s).
This isn’t the iPod days where you would sync your watch history with iTunes.
At the end of the day, they could be taking screenshots of everything you do with your TV and argue it's because of some AI system that will allow you to more easily launch whatever it is you normally do at that time of the day. If you do not see any issue with that, why would you be on this thread?
Apple tracks what you are watching on AppleTV only.
I’m on this thread because I understand technology.
Are you saying that if you are watching something like “South Park” you wouldn’t want the service that you are watching it on to keep track of where you are in its 25 season run?
So the solution they propose to TVs that track what you're watching is to switch to AppleTV where Apple will track what you're watching? And you still justify this somehow?
How else are they going to mark what you watched and where you are in a TV series?
You still do not get it: you can find a pseudo-justification for _every_ type of tracking they do to you. But none of these are really true justifications. You can do _everything_ without any type of tracking -- even the very basic premise: it shouldn't even be true that you need an account _at all_ to use an Apple TV.
How could an AppleTV or any device connected to an HDMI port know what you are watching on other input sources?
The AppleTV device doesn’t track what you watch at all. The AppleTV+ service knows what you watch on their service.
There is no justification for the TV to know anything. There is obviously a reason for each service to know what you watch on their service. What exactly are you arguing? That you should be able to use the AppleTV+ service anonymously?
Obviously it only records what you watch through it.
> A smart TV can track what you watch no matter which input source you are using. How could an AppleTV or any device connected to an HDMI port know what you are watching on other input sources?
I thought the entire point was to _use_ the Apple TV. If you buy the Apple TV, but still use the other HDMI ports for your viewing .... why did you buy the Apple TV in the first place?
> The AppleTV+ service knows what you watch on their service.
And if you use the Apple TV, what you watch through Apple TV's TV program.
> There is no justification for the TV to know anything.
Of course there is. They will claim it's so it remembers your favorite channel, or so they can send you spam^W updates on the schedule of your favorite programs, or whatever other crap people like you eventually end up thinking of as an indispensable feature for which they happily accept tracking.
> There is obviously a reason for each service to know what you watch on their service. What exactly are you arguing? That you should be able to use the AppleTV+ service anonymously?
That _there is_ a way to do broadcast TV anonymously. You do not need accounts, sync between multiple devices, or anything; and even if you need them, there are alternatives. You are in error when you think that your pseudo-justifications are worth anything more than the ones Samsung will provide. The fact that you immediately jump from "I need this" to "therefore the service provider must be able to track everything I do" is telling.
That’s completely not true. Are you claiming that Apple intercepts what other apps are doing when you run them?
The apps voluntarily integrate what you are currently watching and in the middle of watching with the AppleTV app so you can get a consolidated view. Apple isn’t going in and monitoring any more than it’s monitoring your email when it is sent to you.
I can't help but wonder if they are just afraid of the offering looking more bare, or if it's really such an uncommon desire to want to see "new to me" stuff and not repeat things.
I did the same last year though when I couldn’t find a good non-smart tv. Even if you don’t like the advice it is a practical solution for normies.
The only Apple “ads” I ever see are inside the Apple TV+ app (yeah, their naming is confusing…) and it’s only for TV shows they’re promoting in their streaming service.
That’s very different from turning on your TV and seeing an ad for Mercedes or whatever taking up the screen.
They won’t actually let you delete the Apple TV app, but if you move it out of the top row you will never see the ads.
My parents have an Amazon Fire TV, and when I go to their house and have to use it, it drives me insane. Carousels of ads at the top, banner ads as you scroll, full rows of sponsored apps. Full-screen ads for random Amazon products when you pause any show you are watching. Everything you watch on Amazon's streaming service has minute-long unskippable ads. Sometimes when you turn it on, Alexa will just verbally read you ads.
It’s truly a dystopian piece of tech.
Just feels the best that it's not a commercial product, rather a project built by cool people.
> If you want premium image quality or sound, you’re better off using a smart TV offline.
In the future, if they add e-sims, we'll just remove them or de-solder or whatever.
The real risk is cars: if they start not working without cell network connections.
If we continue giving money to people who build malware into the products, the malware will eventually be baked in deeply enough that the rest of the device will refuse to operate if it can't phone home to the ministry of truth or wherever.
I let my latest LG TV on the network, but block internet access at the router. HomeKit integration (Siri turn off tv), Chromecast, Airplay, and other local services all work, without the ability for it to phone home.
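For anyone wanting to replicate that setup, here is a minimal sketch of the router side, assuming a Linux/OpenWrt-style router running nftables, a static lease for the TV at 192.168.1.50, and a WAN interface named "wan" (all assumptions about your particular network):

```shell
# Block the TV's route to the internet while leaving LAN traffic alone.
# LAN-to-LAN services (AirPlay, Chromecast, HomeKit) never cross the
# forward-to-WAN path, so they keep working.
nft add table inet tvblock
nft add chain inet tvblock forward '{ type filter hook forward priority 0; }'
nft add rule inet tvblock forward ip saddr 192.168.1.50 oifname "wan" drop
```

On stock OpenWrt you'd more likely add an equivalent "traffic rule" in the LuCI firewall UI, but the effect is the same.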
Given how limited cell service is in a lot of the US, I think we're a ways off from this.
But also, it's unlikely I'll live long enough where keeping an older vehicle won't be an option.
There’s just no reason for this. You have one job: Take my signal and display it. Anything else is just another place for things to go wrong.
This is commonly repeated, but as far as I can tell nobody has actually demonstrated it.
Buy a cheap smart TV and run it in "store mode".
Brightness and saturation will probably be maxed out, but on a cheap TV that looks more like "normal" on a more expensive model. Hint: in some cases the main difference between cheap and expensive is that the color adjustment range is limited by software on the cheaper models.
Currently using a Hisense 4k model from Costco connected to a small mini PC --- Windows or Linux, your preference. The TV functions as nothing but a dumb display.
Use a small "air mouse" for control. On screen keyboard as needed.
Use a Hauppauge USB tuner for local digital broadcasts.
I use software called DVB Viewer to view local channels and IPTV. A browser with VPN for streaming in some cases.
In every case, I maintain full control of my data and the ability to block ads as I see fit.
They aren't "cheap," but just last week I unboxed and tested 5 different Samsung S95F televisions of 4 different sizes.
One of the functions that each of them promised to perform when set to "retail mode" was to reset the picture settings every 5 minutes.
That makes retail mode a non-starter for anyone who seeks any semblance of accuracy in their video system, at least on these particular televisions.
seems on the cheaper side and it might work like he said
Why does it have to be cheap? What if I want a killer panel without all the bs?
> Use a small "air mouse" for control
An alternative is something like 'unified remote' on it, then you can even type from your phone without any pain.
> A browser with VPN for streaming in some cases.
There is a missing piece for me here. A magic 'send my PC browser tab to this other PC connected to the TV' button. Not sure if something like this exists. It would be ideal to send all the browser context with cookies etc so that you are logged in too and can just start playing whatever you found on PC.
Any form of cast is not an option; rendering has to happen on the TV PC box.
It doesn't have to be --- but you may be wasting your money if you run in "store mode".
As noted above, "store mode" will usually max out the brightness, saturation and contrast while removing user control. This looks pretty "normal" with cheaper models. More expensive ones can become overbearing.
It appears to me that in some cases, the difference between cheap and more expensive is mainly the color adjustments.
In order to take advantage of economies of scale, they may use the exact same screen panel on multiple different models but limit the cheaper ones in software so it doesn't look as "bright" and "eye catching" in the store as their more expensive "killer" model.
Chromecast does exactly this and has existed since 2013.
You can send a tab to another device on Firefox. It doesn't come with all the browser context, but it's pretty handy.
I use an NVIDIA Shield on a dumb TV with Firefox sideloaded (ad blockers, etc.) for 95% of my streaming. You can import your cookies or other preferences, or simply browse for content directly.
That probably mimics Samsung TVs, which are popular for that reason but look like crap.
The actual best TVs, picture wise, are among the LG C series, which are surprisingly dim and unsaturated. That said, mine has held up terribly so I won't buy another. My $200 Onn looks good enough to my eyes and lasted longer.
But I argue for these projects to have a long tail, they need revenue.
A few have tried by selling hardware, but it never lands mainstream enough.
I'd read reports that Q-Symphony (audio from the TV speakers and soundbar simultaneously) wouldn't work, but it does.
I stuck an OSMC (https://osmc.tv/) box to the back of both of them so they can play stuff from my NAS. They're not the cheapest solution and I realise Kodi/XBMC on which they're based isn't everyone's jam (I grew up with XBMC on an Xbox so it is very much mine) - but they play everything, have wifi, HDMI-CEC, integrated RF remote, and work out of the box.
Model numbers if anyone cares: Samsung QE65S95C, Samsung QE77S95F. I believe S95, S90 and S85 (at least up to F) are all very similar so they should all work but ofc ymmv.
I wouldn't recommend Kodi for streaming, it kinda works but the experience isn't great. I use it exclusively for playing stuff from my server full of legally acquired public domain videos (ahem).
I do watch YouTube videos on it, but I use TubeArchivist (basically a fancy wrapper for yt-dlp) to pull them onto the server first, and a script to organise them into nicely-named directories.
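Much of that organising can be done by yt-dlp's own output template; this is only a guess at what such a setup might look like, not the commenter's actual script (template and URL are placeholders):

```shell
# Sort downloads into per-channel directories with sortable date prefixes.
yt-dlp \
  --output "%(channel)s/%(upload_date)s - %(title)s.%(ext)s" \
  --embed-metadata \
  "https://www.youtube.com/watch?v=EXAMPLE"
```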
I’m using a Minix Z100 running GNOME and Kodi. I use a simple Bluetooth keyboard; the interface is clunky but it does the job. I use Samba to also share files to VLC running on iOS and Android on the same network.
I tried using fancier solutions, but anything that browses content without involving directories always breaks for some specific content in unpredictable ways.
An alternative could be some x86 Android TV build like Lineage, but I have not seen very convincing demonstrations that this is truly viable.
I just think of them as the best solution to run Kodi for media that is on my network.
It felt illegal.
But kidding aside, who are we even really kidding anymore? Even if you were provided the TOS, would you simply not use the device if there were something in the TOS you disagreed with? How about when you’ve been using the device and all of a sudden they change the TOS and force agreement just as you are about to start a TV evening with the family?
People simply accepted their enslavement, the taking of their agency, because we all allowed it or were overwhelmed by it.
They take our agency through process, just like they’ve taken our freedom and rights in so many different ways; just like through YC-funded Flock, where treasonous mass-surveillance cameras just show up overnight, while most here seem unaware it’s a YC company that now provides a mass surveillance network to the government, a global government tightening its noose around humanity’s neck.
Now that it's connected, it shows an ad at that time, in the same way. Can't win.
Source: my open test network and a neighbor's TV that keeps trying to phone home over it.
This is what the article recommends by the way.
Or blacklist the TV's MAC address in your router settings. Didn't think of that first for some reason.
So no remote. I get up, hit the spacebar to pause/play. The audio is into a multi-channel receiver though so audio has mute/volume controls on a remote.
I get that people would rather have a remote but I personally actually don't like remotes at all. My TV is basically a screen only.
As far as I know there are no remotes that work with macOS.
> A spokesperson from Panasonic Connect North America told me that digital signage displays are made to be on for 16 to 24 hours per day and with high brightness levels to accommodate “retail and public environments.”
Some TVs err on the side of being too dim for daytime viewing in a bright room; that could only be a plus.
If it's too bright in a way that can't be turned down, you could always DIY a tinted shield to put over it for evening viewing. We used to use things like that over CRT monitors once upon a time.
> Their rugged construction and heat management systems make them ideal for demanding commercial use, but these same features can result in higher energy consumption, louder operation, and limited compatibility with home entertainment systems.
I've never heard a commercial flat screen display make a sound.
> Panasonic’s representative also pointed out that real TVs offer consumer-friendly features for watching TV, like “home-optimized picture tuning, simplified audio integration, and user-friendly menu interfaces.”
That person doesn't understand how this would be used at all. The user hooking up their streaming box to the display panel only needs the panel to do video (e.g. via HDMI cable). The display is not involved in audio at all.
I use a 1/8" plug stereo cable going straight from the Android box to a pair of RCA jacks in the speaker system. Bluetooth could be used but the wire has lower latency, 100% reliability, and not using BT means that the speakers are available for pairing if someone wants to use them from a phone. They have a remote control that can switch between two copper line inputs, and BT. The TV's volume is kept at 1%; it would make no difference if it had no speakers.
If I need to update an app, I temporarily allow Google services access. All the streaming apps work well, except for HBO Max which takes a few minutes to load. I suspect it has a long timeout/retry count for something I'm blocking. But once it loads, it's fine.
I also use a different, basic home launcher so we can open the apps we want immediately, without having to deal with shifting algorithm-based icons. But even if we use the Google launcher, it's mostly empty and free of ads because it can't connect. It does still capture what I've recently watched, though.
Overall it's a decent experience, mainly because we're not being bombarded by more ad algorithms.
A minor problem is that it displays "Turning on AI voice features" every time I turn it on, but those features are not actually turned on. It probably tries to, but since I never connected the TV to the internet, this fails. Still have to figure out how to get rid of the message.
[1] https://www.bestbuy.ca/en-ca/product/lg-50-ua7000-4k-uhd-hdr...
They were cheap and the picture quality is great. Not OLED level, but jeeze I had to share a 27” CRT for my SNES as a kid—
But also I pretty much never use the TV button to turn it on, I click a button on one of the connected devices to wake it and the TV turns itself on with that input selected. Even if it’s already on, if I want to switch from one device to another I can just wake the other device and it will switch inputs for me. It works really well, I almost never have to use the input selector and it just does the right thing reliably.
I’m a happy camper. Newer stuff would feel like a downgrade. Couldn’t care less about 4K video. Never willfully sought it out; unimpressed when I see it.
I still go to the cinema regularly, alone. Something very deliberate about going to a place to pay a price to go into a specific space to do a specific thing. Pandering to second screen audiences has produced some of the most profoundly insulting media in living memory.
Cars from around 1998-2014 usually have side curtain airbags & adequate rollover durability. The only improvements since then that I'd even want at all are better EV batteries & marginal efficiency gains for IC engines, but those can be retrofitted &/or aren't worth the anti features they also added IMO.
If car companies want my business they'll have to remove the telemetry & automatic updates.
I don't care if I end up paying more to drive an old car eventually, but this approach has also been saving me money so far.
FWIW I have two 2018 models with zero “smart” features.
Of course if you are one of those drivers who removes their hands from the wheel in a stressful situation (there are many), these systems will help somewhat.
My barber and grocery store are a $9 Uber ride each way. So I could easily get away without a car where I live now. My wife and I have been down to one car since Covid.
But when I was in the burbs of metro Atlanta, where everything wasn’t so close, it would have been over $100 easily going from one side to the other, or basically anywhere besides the grocery store.
My car insurance is only $176 a month for my wife and me. It doesn’t make sense not to have a car, even if you include the minor maintenance on a car that would be hardly ever driven. Even at a theoretical $400 car payment plus $176 in insurance, it’s still easy to come out ahead.
I live in a tourist area where there are a lot of drivers causing the prices to be low. I noticed it in Las Vegas too.
The only reason I know is I use Uber to run errands close by when my wife has the car on the weekends.
Pannier bags. I did this for years. Before I got panniers I filled a big camping rucksack and cycled, but I wouldn't recommend that. Use a small backpack in addition to panniers if you have to, but having just the panniers feels the best.
However, in terms of safety you are unfortunately right. I didn't have a car so I went everywhere by bike but I was essentially a third class citizen in many places. Felt like I could just get wiped out and nobody would even care. There were no people around, only cars. I hate cars, so I had to get a car too :(
At this point, I treat rideshare like public transit: I assume I'm being watched, but I get to skip the permanent always-on tracking for the other 99% of the time that I'm not in the car.
Also, if you own a car, the state knows where you're going and when, per ALPR systems. With Uber or Lyft or a robotaxi, there's a layer between my personal information and the state. It's not an insurmountable layer, as rideshare / robotaxi services can always be subpoena'd, but adding a layer of extra work for the state is a net gain to my privacy.
see: Android's recent transformation into a closed platform which no longer allows users to control devices they purchase. it's important to fight against trends like this loudly and vehemently while we still can.
The reason projectors don't use a single RGB LCD (like monitors) to produce color is the same reason all sub-$5,000 projectors use pixel shift to fake 4K resolution: too much light is blocked by the LCD itself if the individual pixels become too small.
It has low latency, will do 1080p 240Hz, 4k (pixel shift) 60Hz and HDR. Can even do 3D content if you really want...
BenQ did include an Android TV stick in the box, but you can just not hook it up to the projector - problem solved.
Either I can do the stupid thing and connect my LG TV to the network, or through various means download the UHD content, and therefore have to manage it, especially the last-watched position, or forgo it.
Having ADHD, I never really watch to the end, and so rely so much on the saved position to resume.
It would be prohibitively costly to produce per-device renditions so instead there is one generic rendition produced for "all smart TVs" and another one for "UHD capable smart TVs".
Traditional TV manufacturers all work with the BBC to get their devices certified, which is a requirement for carrying the iPlayer app and comes with legal agreements that assert that a device _will_ be able to play back BBC content for as long as it's supported.
Because Apple like to Think Differently, they opted not to align with the entire rest of the TV industry in standardising on MPEG-DASH spec. They instead require all developers to stream video using the HLS protocol. As UHD content on iPlayer is geared exclusively for smart TVs, and all the other smart TVs support MPEG-DASH, the UHD workflow simply never evolved the ability to target Apple's TV devices.
2. setup a one time use wifi network with randomized SSID and password (hotspot from your phone works well)
3. connect your tv to it and update to latest software
4. delete the wifi config and reset that network (roll to new SSID and password)
5. connect an apple tv set top box and never use any of the tv features ever again
For now I spend the extra money for "digital display" TVs that are just dumb input for HDMI devices but I fear that someday that option will either disappear or fall significantly behind regular TVs in display technology.
Turning a HDMI device on wakes the TV and then it automatically selects that input. I've never been to the homescreen except by choice, and even then it is completely stock. Barebones, no ads - it has no internet to get any.
Someone is going to run in here talking about how smart TVs randomly connect themselves to wifi, which is absolute nonsense.
HN things I guess.
The only change I had to make starting from a "standard" Linux UI is bumping the screen zoom level to 150%. This may vary depending on your TV size and how far your couch is from your TV.
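If the HTPC happens to run GNOME (an assumption; other desktops have their own knobs), that zoom bump can even be scripted:

```shell
# Scale UI text to 150% for couch-distance readability.
gsettings set org.gnome.desktop.interface text-scaling-factor 1.5
```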
Building the HTPC was very cheap: I just bought a horizontal form-factor case and used spare "donor" parts coming from our household PCs after upgrades.
[1][2]For comparison, the only streaming platform that had all apps I wanted was Apple TV, but that one doesn't have a browser.
On Windows, it used to be different, but lately I’ve observed the same—ex: Netflix seems to limit the streaming quality even with Edge.
No one offers actual fidelity on the streaming platforms. They consider cost to them to serve content and assume you don’t care enough to seek alternatives.
So it is not always the case that the UHD disk is better in all aspects.
But now I wonder why your aggressiveness sounds so defensive.
Practically because lots of "open" wifi networks have captive portals that don't actually get you Internet access without further action, and legally because using random networks without user confirmation is rather dodgy.
It's an urban legend that people keep repeating, and nobody can ever point to a specific case of it happening. It would be extremely easy to demonstrate: set up an open network, take a new or factory-reset TV, and wait.
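The experiment is cheap to run. A rough sketch on a spare Linux box with an AP-capable Wi-Fi card (interface name and channel are assumptions):

```shell
# Bring up an open access point and watch whether the TV ever joins.
cat > /tmp/hostapd-open.conf <<'EOF'
interface=wlan0
ssid=open-test-net
hw_mode=g
channel=6
EOF
hostapd /tmp/hostapd-open.conf &

# If the TV associates, its DHCP requests and DNS lookups show up here.
tcpdump -i wlan0 -n 'port 67 or port 53'
```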
Getting rid of ads on the streaming stick and various streaming services is an interesting challenge though...
The AppleTV supports CEC and controls the power and the volume.
No nagging
I keep avoiding the upgrade to keep the possibility open. At some point they force upgrade your firmware.
I had an Apple TV as well, but I don't use it anymore. And I otherwise only use Apple devices. But the Apple TV I just never got warm with.
Then why mention the pitiful shit? That describes a LCD TV I had in 2004, one of the first.
> but some of them also have a built-in DVD player.
Well, that changes everything; I want one now, LOL ...
Unlike phones,
- if it should be air-gapped, then all you’d want is your HDMI inputs and remote control to work.
- nice to have: ADCs/DACs for analog AV input and audio out and any antenna input if available.
- super nice to have: Bluetooth for passing audio out and maybe network (Ethernet, WiFi) stack if same.
But assuming the goal is an air-gapped TV, there are fewer security concerns in general; you just want the Android TV to be lightweight and fast, and you don’t care that it’s “stuck” on a specific version or uses closed blobs.
One problem with that approach is that you'll lose access to DRM'd contents, so while the official Netflix/HBO/Prime apps will install on lineageos, their video quality will be terrible or they will refuse to work.
There are a bunch of Google TV variants (brands like TCL and Philips) that will let you turn on "basic TV mode" (https://support.google.com/googletv/answer/10408998?hl=en), disabling pretty much everything other than displaying content.
As for why the Chinese TVs don't have a dumb mode, I think it's because the Chinese market is full of devices crammed to the brim with smart features, so smart TVs are sort of expected these days.
I hope it is not yet important for me, as I never allowed a TV access to my LAN/WLAN. But with smart devices possibly using accessible open WLANs to transmit, who knows.
[1] https://arxiv.org/abs/2409.06203 / https://arxiv.org/pdf/2409.06203
For budget-conscious setup: even older plasma/LCD displays that predate the "smart" era are increasingly available secondhand. Pair with a Raspberry Pi or similar and you get a system you actually own.
Sure, there will probably be some alternatives from independent/smaller manufacturers but they will inevitably be based on older tech and/or standards, come with serious tradeoffs and so on.
This plus all the notes below about how various apps won't stream 4k in various circumstances depending on platform or web browser just lend further credence to the idea that it's best to say fuck it and deploy a jellyfin instance and sail the high seas. Or at least rip blu rays.
I mean why would I pay all these streaming services for such subpar service?
Sometimes I wonder if the people recommending pihole actually tried it. You get much better value out of ublock, smarttube, and so on.
pi#1) My personal DNS resolver, which I manually configure on each device.
pi#2) The much less restrictive DNS resolver which my DHCP server automatically issues to all other network clients, including all phones and IoT [0]
Individual hosts can then manually configure their DNS to resolve to the local network router (or third-party DNS), which effectively bypasses both PiHoles (for that device, only).
[0] There is a method to use a firewall to capture all outbound DNS and force routing through PiHole (pfSense? I don't know), which may be necessary for hard-coded DNS IPs. I do not know how to do this, but it's not necessary on my network.
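For what it's worth, the forced-DNS trick in [0] is doable on any Linux-based router with a NAT redirect. A sketch, assuming the PiHole lives at 192.168.1.2 and the LAN interface is br-lan; note this only catches plain port-53 DNS, not DoH/DoT:

```shell
# Rewrite any outbound plain-DNS query to land on the PiHole instead,
# except queries from the PiHole itself (it still needs real upstreams).
iptables -t nat -A PREROUTING -i br-lan -p udp --dport 53 \
  ! -s 192.168.1.2 -j DNAT --to-destination 192.168.1.2:53
iptables -t nat -A PREROUTING -i br-lan -p tcp --dport 53 \
  ! -s 192.168.1.2 -j DNAT --to-destination 192.168.1.2:53
```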
My concern is the frame rate. Some of these TVs, even in the 1080p era, will turn a cinematic masterpiece into feeling like a cheap soap opera. I’m not even sure what to look for to avoid this issue; limiting myself to maybe 48 Hz TVs? (The soap-opera effect usually comes from motion interpolation, which most TVs let you disable.)
That can be as simple as an Apple / Android TV, or more.
I did not give my TV network access. Works fine.
Fine that you need to run ads and maybe partner with someone to sell those ads, but 226 of them?
You can intercept those as long as they're not using DoH/DoT.
There's historical speculation that a smart TV could connect to an open wireless access point, or more realistically, that it refuses to operate without internet access, perhaps after a certain number of power on hours.
1. Spend money. AppleTV and the Nvidia Shield have the best hardware followed by high end Roku devices.
2. Use a computer. That’s a horrible experience.
Obligatory David Foster Wallace, just to add some Gen X post-structuralist nihilism.
I destroyed them and threw them in a dumpster like that Ron Swanson gif.
All to say, little cellular modems and a small data plan are likely getting cheap enough it's worth being extra diligent about the devices we let into our homes. Probably not yet to the point of that being the case on a tv, but I could certainly see it getting to that point soon enough.
Turns out they track the aggregate of everyone’s brushing and if every employee brushes their teeth, the plan gets a discount.
”Lower rate based on group's participation in Beam Perks™ wellness program and a group aggregate Beam score of "A". Based on Beam® internal brushing and utilization data.”
Until people start abusing these "features", they will not go away.
The data plans on some embedded modems are quite different from consumer plans. They are specifically designed for customers who have a large number of devices but only need a small amount of bandwidth on each device.
These plans might have a very low fixed monthly cost but only include a small data allowance, say 100 KB/month. That's plenty for something like a blood pressure monitor that uploads your results to your doctor or insurance company.
If you are lucky that's a hard cap and the data plan cuts off for the rest of the month when you hit it.
If you are unlucky that plan includes additional data that is very expensive. I've heard numbers like $10 for each additional 100 KB.
I definitely recall reading news articles about people who have repurposed a SIM from some device for their internet access, figuring the company would not notice, and used it to watch movies and download large files.
Then the company gets their bill from their wireless service provider, and it turns out that on the long list of line items showing the cost for each modem, a single say $35 000 item really stands out when all the others are $1.
If you are lucky the company merely asks you to pay that, and if you refuse they take you to civil court where you will lose. (That's what happened in the articles I remember reading, which is how they came to the public's attention).
If you are unlucky, what you did also falls under your jurisdiction's "theft of services" criminal law. Worse, the amount is likely above the maximum for misdemeanor theft of services, so it would be felony theft of services.
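To see how fast the hearsay numbers above snowball, a back-of-envelope check (the rates are the rumored ones from this thread, not any real carrier's plan):

```python
INCLUDED_KB = 100          # tiny monthly allowance
OVERAGE_PER_100KB = 10.00  # rumored dollars per extra 100 KB block

def overage_cost(used_kb: int) -> float:
    """Cost beyond the allowance, billed per started 100 KB block."""
    extra_kb = max(0, used_kb - INCLUDED_KB)
    blocks = -(-extra_kb // 100)  # ceiling division
    return blocks * OVERAGE_PER_100KB

# A single ~350 MB movie download on such a plan:
print(overage_cost(350_000))  # prints 34990.0
```

Which lands right in the ballpark of the $35,000 line item mentioned above.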
Now maybe you mean the TV? That’s not what this particular thread is about.
This thread is about removing the SIM from a TV.
If I bought that TV in cash (or even credit card, sans subpoena) at a Best Buy and removed the SIM, how is any corporation identifying me?
And once the SIM connects near your house, what is preventing the phone company from telling TVManufacturer the rough location of the SIM, especially after that SIM is found to have used too much data?
Then use some commercially available ad database to figure out that the person typically near this location with these last four digits is 15155.
That's just a guess, but there is enough fingerprinting that they will know with pretty high certainty it is you. Whether all this is admissible in civil court, idk.
No law: reality and PCI standards prevent this. And of course, the manufacturer could get a subpoena after enough process. This also assumes the TV was purchased with a credit card and not cash.
> And once the SIM connects near your house
> what is preventing the phone company from telling
Again: reality and the fact that corporations aren't cooperative. A rough location doesn't help identify someone in any urban environment. Corporations are not the FBI or FCC on a fox hunt.
Can you cite a single case where this has happened on behalf of a corporation? These are public record, of course.
https://wonderfulengineering.com/rtx-5080-buyer-opens-box-to...
I know I'm sure never shopping there again.
https://about.att.com/blogs/2025/5g-redcap.html https://www.t-mobile.com/news/network/5g-redcap-powering-sma...
Wouldn’t surprise me to see modems and eSIMs and embedded PCB antennas some day down the line.
> Dumb TVs sold today have serious image and sound quality tradeoffs, simply because companies don’t make dumb versions of their high-end models. On the image side, you can expect lower resolutions, sizes, and brightness levels and poorer viewing angles. You also won’t find premium panel technologies like OLED. If you want premium image quality or sound, you’re better off using a smart TV offline. Dumb TVs also usually have shorter (one-year) warranties.
Not to mention that disabling the smart/ad features is an option on some smart TVs (e.g. Sony).