Because they were making Reddit 2.0, with all the same flaws.
Excellent, thank you! I was just going to throw Ubuntu at it unless I really needed something else because of the potato specs, so hopefully drivers are already sorted.
especially the WiFi/Bluetooth chipset
Noted. I would be pissed to not have that working.
Don’t try anything fancy
No chance, I’ve been burnt by my Unix arrogance enough times not to want to try it on proprietary hardware. Until now I assumed even getting Linux on there was too fancy; I still remember other people fighting for weeks with their Hackintoshes a decade ago.
Thanks, I appreciate it! I’m happy for a dual boot and for no camera, so here’s hoping for the rest.
I have a late 2011 that I might be interested in doing this to. Any practical advice on avoiding your suffering?
I appreciate you. 🙏 I have been considering looking into hardening my home network, but I dreaded the idea of figuring out which tools weren’t just sponsored SEO-optimising AI-generated time-wasting network-snooping bullshit. This gives me somewhere to start.
Marketers gotta market something, I guess.
People have already mentioned testing and abstraction, but what about other developers and security?
Spaghetti code all you like in solo projects. But if someone else is coming along to debug a problem in their toppings, why would you make them remember anything about baking or the box when it’s completely irrelevant?
And why should the Box object be able to access anything about the Oven’s functionality or properties? Enjoy your oven fire and spam orders when someone works out they can trigger the bake function or access an Order’s payment details from a security hole in the Box object implementation.
It’s not just about readability as a narrative, even if that feels intuitive. It’s also about memory management, collaboration and security.
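To make the boundary concrete, here’s a minimal sketch in Python (class names borrowed from the pizza analogy above; everything else is illustrative, not from any real codebase). The Box knows nothing about the Oven, and payment details stay behind the Order’s public surface:

```python
class Order:
    def __init__(self, toppings, payment_token):
        self.toppings = toppings
        self._payment_token = payment_token  # internal: not part of the public surface

    def masked_payment(self):
        # Expose only what collaborators legitimately need to see.
        return "****" + self._payment_token[-4:]


class Oven:
    def bake(self, order):
        # The Oven needs the toppings, but never the payment details.
        return f"baked pizza with {', '.join(order.toppings)}"


class Box:
    """Holds a finished pizza. Knows nothing about ovens or payments."""
    def __init__(self):
        self.contents = None

    def pack(self, pizza):
        self.contents = pizza
        return self


order = Order(["cheese", "basil"], "4111222233334444")
boxed = Box().pack(Oven().bake(order))
print(boxed.contents)          # baked pizza with cheese, basil
print(order.masked_payment())  # ****4444
```

Someone debugging the Box only ever sees a finished pizza; there’s no path from Box back to Oven.bake or the raw payment token, which is exactly the point about security and cognitive load.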
Enough people that the dude is still actively supporting and updating it for new OS versions. The RAR is unkillable.
Not necessarily. For example, I know RAR is a bit out of style, but just this week there were articles about WinRAR and malware lurking in otherwise non-executable files.
There is no such thing as ‘safe’ user-generated content, only a spectrum of more or less safe content.
We used post-it notes on a wall at a previous workplace to aid a truly useless manager. It didn’t make him a better manager, but it did have upsides. It felt great to crunch completed tasks up into little balls and throw them in the recycling when we did standups. The extra visibility in the room was really helpful too, other colleagues would ask us about our work or when we might be free for their whims, and we could just point at the wall and say “after all that shit is done?”. Usually they would see the mountain in the to-do columns and say “oh.” and then walk off dejectedly. It stopped a lot of bullshit requests with the mere presence of colourful papers fluttering in the aircon, including incompetent managerial scope creep.
The fridge would work well for this with some little magnets and/or a whiteboard marker, like people do with reward charts for kids.
Even as someone who declines all cookies where possible on every site, I have to ask: how do you think they are going to be able to improve their language-based services without using large language models or other algorithmic evaluation of user data?
I get that the combination of AI and privacy has huge consequences, and that Grammarly’s opt-out limits are genuinely shit. But it seems like everyone is so scared of the concept of AI that we’re harming research on tools that can help us, while the tools which hurt us are developed with no consequences, because they don’t bother with any transparency or announcements.
Not that I’m any fan of Grammarly; I don’t use it. I think that might be self-evident, though.
How will we be making WASM-based UI accessible for people using screen readers, screen zoom applications, text to speech and voice input users, etc.?
The Web is hostile enough to people with disabilities, despite its intent, and developers are already unfamiliar with how to make proper semantic and accessible websites which use JS. Throwing the baby out with the bathwater by replacing everything with WASM in its current form seems about as good an idea as Google’s Web Environment Integrity proposal.
In the meantime, they’re going to shout at it.
It should be OK. It’s due to self-reset its orientation on October 15; they put measures in place in case they accidentally lost contact.
I would still be beside myself if I had made that error, though.
Honestly, I’m not sure. Privacy has always been a spectrum, but we’re now living in a world where it’s near impossible to get close to 100% privacy for any action from the start. I suspect the current possible remedies are “ensuring the people and organisations which use/abuse surveillance are heavily regulated and compliance heavily enforced” which ironically requires transparency.
Realistically, there need to be lengthy legal procedures to grant authorities and companies use of such techniques. Legislation like that is complicated and slow to develop, though. It also risks pinning the core privacy concepts to specific versions of specific tech, which complicates its enforcement over time.
Even if it is very illegal to do this to someone though, there will always be people who use it for whatever purposes. Obviously making it illegal under wiretapping laws without explicit opt-in consent to do it is something that would need to happen. I’d also like to see mandatory source attribution laws.
That won’t stop everyone though. Which means we maybe need to start looking into construction legislation to ensure RF-blocking materials are used in external wall construction. If that is an effective remedy to Van Eck phreaking at all; I have no idea what resolution of information can be determined from devices that aren’t purpose-built broadcasting and receiving devices.
And all of that requires good-will and sensible decisions from the existing legal systems and legislators. Which can’t be completely achieved, and in many cases is… currently very poor.
Tl;dr A very hard problem which will need work from a bunch of different parts of society and likely cannot be completely solved for all people. The only remedy for this specific technique right now, I think, is to go fully off-grid with no electricity. Even then, though, you’ll still have satellites and drones to intrude.
The fact they’re able to do this is no surprise to me. The fact they’re able to do this on very easily accessible equipment to that degree of accuracy is scary impressive.
While this obviously has huge consequences for privacy, the part that concerns me most is its usage in development of deep fakes. I worry about the consequences of no longer being able to distinguish real video evidence from deliberate manipulation.
The “Bug: violation of W3C code of ethics” issue was opened 15 hours ago.
It was closed 14 hours ago with the status “completed” without further comment.
The guy who closed it posted an entry a day earlier called “So, you don’t like a web platform proposal” on his rarely used blog. It has the appearance of telling people how to critique proposals in a professional way, while simultaneously dismissing any attempt at communication. Perhaps he needs to reflect a little more on his blog entry’s subheading “We’re all humans”, because he doesn’t seem keen to address how users who rely on Assistive Technology are going to be able to use his DRM Web.
Edited to add: The code of ethics is for people who work at the W3C, so it’s not entirely applicable anyway.
I am interested in learning more about this. I know a fair bit about networks but exploit history and modern attack / defense strategies and server hardening are not my main specialty. Do you have any good links or resources that you can share?