• 30 Posts
  • 2.02K Comments
Joined 1 year ago
Cake day: October 4th, 2023

  • Facts are not copyrightable, just their presentation. So I don’t think that it’s possible to say that summarizing material is flatly off the table. A court is going to say that some form of summary is legal.

    On the other hand, simply taking material and passing it through an AI and producing the same material as the source — which would be an extreme case — is definitely copyright infringement. So there’s no way that a court is going to just say that any output from an AI is legal.

    We already have criteria for what’s infringing, whether a work is “derivative” or not.

    My bet is that a court is going to tell Brave “no”, and that it’s up to Brave to make sure that any given work it produces isn’t derivative, using existing case law. Like, that’s a pain for AI summary generators, but it kind of comes with the field.

    Maybe it’s possible to ask a court for clearer and harder criteria for what makes a work derivative or not, if we expect to be bumping up against the line, but my guess is that summary generators aren’t very impacted by this compared to most AI and non-AI uses. If the criteria get shifted to be a little bit more permissive (“you can have six consecutive words identical to the source material”, say) or less permissive (“you can have three consecutive words identical to the source material”), my guess is that it’s relatively easy for summary generators to update and change their behavior, since I doubt that people are keeping these summaries around.
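
    To illustrate why a summary generator could adapt to a bright-line rule like that fairly easily, here’s a rough Python sketch of a post-hoc filter that checks for runs of N consecutive words shared with the source. The threshold and the whole approach are hypothetical, my own illustration rather than anything a court has adopted:

    ```python
    import re

    def longest_shared_run(source: str, summary: str) -> int:
        """Length of the longest run of consecutive words the summary shares with the source."""
        def words(text):
            return re.findall(r"[a-z0-9']+", text.lower())
        src, out = words(source), words(summary)
        longest = 0
        # Classic longest-common-substring dynamic programming, but over word
        # tokens instead of characters.
        prev = [0] * (len(src) + 1)
        for i in range(1, len(out) + 1):
            cur = [0] * (len(src) + 1)
            for j in range(1, len(src) + 1):
                if out[i - 1] == src[j - 1]:
                    cur[j] = prev[j - 1] + 1
                    longest = max(longest, cur[j])
            prev = cur
        return longest

    def summary_passes(source: str, summary: str, max_run: int = 6) -> bool:
        """Hypothetical bright-line test: no more than max_run consecutive words copied."""
        return longest_shared_run(source, summary) <= max_run

    print(summary_passes("The quick brown fox jumps over the lazy dog.",
                         "A quick brown fox jumps over a sleepy dog.", max_run=6))  # True
    ```

    If the criterion moved from six consecutive words to three, you’d just change max_run and regenerate anything that failed, which is part of why I doubt this kind of rule would be hard for summary generators to live with.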


  • If you look at the article, it was only ever possible to do local processing with certain devices and only in English. I assume that those are the ones with enough compute capacity to do local processing, which probably made them cost more, and that the hardware probably isn’t capable of running whatever models Amazon’s running remotely.

    I think that there’s a broader problem than Amazon and voice recognition for people who want self-hosted stuff. That is, throwing loads of parallel hardware at something isn’t cheap. It’s worse if you stick it on every device. Companies — even aside from not wanting someone to pirate their model running on the device — are going to have a hard time selling devices with big, costly, power-hungry parallel compute processors.

    What they can take advantage of is that for a lot of tasks, the compute demand is only intermittent. So if you buy a parallel compute card, the cost can be spread over many users.

    I have a fancy GPU that I got to run LLM stuff that ran about $1000. Say I’m doing AI image generation with it 3% of the time. It’d be possible to do that compute on a shared system off in the Internet, and my actual hardware costs would be about $30. That’s a heckofa big improvement.
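
    To make that amortization concrete, here’s the back-of-the-envelope arithmetic as a tiny Python sketch; the card price and utilization figure are just the illustrative numbers from above:

    ```python
    # Back-of-the-envelope cost amortization for shared parallel compute.
    # The numbers are illustrative, matching the example above.
    card_cost_usd = 1000.0   # price of the GPU / parallel compute card
    utilization = 0.03       # fraction of the time one user actually needs it

    # If demand is intermittent, roughly 1/utilization users can share one card.
    users_per_card = 1 / utilization
    cost_per_user = card_cost_usd / users_per_card   # same as card_cost_usd * utilization

    print(f"~{users_per_card:.0f} users can share one card")          # ~33 users
    print(f"effective hardware cost per user: ${cost_per_user:.0f}")  # $30
    ```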

    And the scale of what they’re dealing with is even larger, since there might be multiple devices in a household that want to do parallel-compute-requiring tasks. So now you’re talking about maybe $1k in hardware for each of them, not to mention the supporting hardware like a beefy power supply.

    This isn’t specific to Amazon. Like, this is true of all devices that want to take advantage of heavyweight parallel compute.

    I think that one thing that might be worth considering for the self-hosted world is the creation of a hardened network parallel compute node that exposes its services over the network. In a scenario like that, you would have one device (well, or more, but you could just have one) that provides generic parallel compute services. Then your smaller, weaker, lower-power devices — phones, Alexa-type speakers, whatever — make use of it over your network, using a generic API.

    There are some issues that come with this. It needs to be hardened: it can’t leak information from one device to another. Some tasks require storing a lot of state — like, AI image generation requires uploading a large model, and you want to cache that. If you have, say, two parallel compute cards/servers, you want to use them intelligently: keep the model loaded on one of them insofar as is reasonable, to avoid needing to reload it. And some devices are very latency-sensitive — like voice recognition — while some, like image generation, are amenable to batch use, so some kind of priority system is probably warranted. So there are some technical problems to solve.
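
    As a rough illustration of what the generic API and priority handling could look like on such a node, here’s a minimal Python sketch. The job fields, the two priority levels, and the idea of caching the last-loaded model are my own assumptions for the sketch, not an existing protocol or product:

    ```python
    """Minimal sketch of a household parallel-compute node's job scheduler.

    Everything here (job fields, priority levels, the model cache) is a
    hypothetical illustration, not an existing API.
    """
    import queue
    import threading
    import time
    from dataclasses import dataclass, field

    # Lower number = more urgent. Latency-sensitive work (voice) outranks batch work.
    PRIORITY_INTERACTIVE = 0   # e.g. voice recognition
    PRIORITY_BATCH = 10        # e.g. image generation

    @dataclass(order=True)
    class Job:
        priority: int
        submitted: float
        model: str = field(compare=False)    # which model weights the job needs
        payload: str = field(compare=False)  # request data (audio, prompt, ...)
        done: threading.Event = field(compare=False, default_factory=threading.Event)
        result: str = field(compare=False, default="")

    class ComputeNode:
        """One worker thread standing in for one GPU / accelerator card."""

        def __init__(self):
            self.jobs = queue.PriorityQueue()  # orders queued jobs by (priority, submitted)
            self.loaded_model = None           # cache, so big models aren't reloaded every time
            threading.Thread(target=self._worker, daemon=True).start()

        def submit(self, priority: int, model: str, payload: str) -> Job:
            job = Job(priority, time.time(), model, payload)
            self.jobs.put(job)
            return job

        def _worker(self):
            while True:
                job = self.jobs.get()
                if job.model != self.loaded_model:
                    time.sleep(0.2)            # stand-in for an expensive model load
                    self.loaded_model = job.model
                job.result = f"[{job.model}] processed {job.payload!r}"
                job.done.set()

    if __name__ == "__main__":
        node = ComputeNode()
        batch = node.submit(PRIORITY_BATCH, "image-model", "a painting of a boat")
        voice = node.submit(PRIORITY_INTERACTIVE, "speech-model", "turn on the lights")
        for j in (voice, batch):
            j.done.wait()
            print(j.result)
    ```

    A real version would obviously need authentication and isolation between clients, and it would expose this over the network (HTTP, gRPC, whatever) rather than in-process, but the queueing and model-caching shape is the interesting part.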

    But otherwise, the only real option for heavy parallel compute is going to be sending your data out to the cloud. And even if you don’t care about the privacy implications or the possibility of a company going under, as I saw some home automation person once point out, you don’t want your light switches to stop working just because your Internet connection is out.

    Having per-household self-hosted parallel compute on one node is still probably more-costly than sharing parallel compute among users. But it’s cheaper than putting parallel compute on every device.

    Linux has some highly-isolated computing environments like seccomp that might be appropriate for implementing the compute portion of such a server, though I don’t know whether it’s too-restrictive to permit running parallel compute tasks.

    In such a scenario, you’d have a “household parallel compute server”, in much the way that one might have a “household music player” hooked up to a house-wide speaker system running something like mpd or a “household media server” providing storage of media, or suchlike.



  • The nerds lost the internet.

    I mean, there wasn’t a shift in control or anything. This is just part of the business plan.

    Reddit, like many B2C online services, intentionally operated at a loss for years in order to grow.

    1. Get capital.

    2. Spend capital providing a service that is as appealing as possible, even if you have to lose money to do it. This builds your userbase. This is especially important with services that experience network effect, like social media, since the value of the network rises with the square of the number of users. This is the “growth phase” of the company.

    3. At some point, capital becomes unavailable or too expensive (e.g. in the interest rate hikes after COVID-19), or you saturate available markets. At that point, you shift into the “monetization phase” – you have to generate a return using that userbase you built. Could be ads, charging for the service or some premium features, harvesting data, whatever. Because interest rates shot up after COVID-19, a lot of Internet service companies were forced to rapidly transition into their monetization phase at the same time. But the point is, your concern then isn’t growing the service as much as making a return, and it’s virtually certain that in some way, the service will become less-desirable, since it’s shifting to prioritizing a return over being desirable enough to draw new users. That transition from growth to monetization phase is what Cory Doctorow called “enshittification”, though some people around here kind of misuse the term to refer to any change that they don’t like.

    Investors were not going to simply shovel money into Reddit forever with no return — they always did so expecting some kind of return, even if it took a long time to build to that return. I hoped that the changes when they moved into a monetization phase would be ones that I could live with. In the end, they weren’t — I wasn’t willing to give up third-party clients if there was an alternative. But it’s possible that they could have come up with some sort of monetization that I was okay with.

    If you don’t mean the transition from growth to monetization at Reddit, but the creation of Reddit at all…

    The main predecessor to Reddit was, I suppose, Usenet. That was a federated system, and while it wasn’t grown with that kind of business model, it wasn’t free — but typically it was a service that was bundled into the bill when one got service from an ISP, along with email and sometimes a small amount of webhosting. Over time, ISPs that provided bundled Usenet service stopped providing it (since it increased their subscription fees and made their actual Internet service uncompetitive for people who didn’t use Usenet service), and because so many people used it to pirate large binaries, the costs of running a full-feed Usenet server increased. Users today that use Usenet typically pay a subscription to some commercial service. You can still get Usenet service now, if that’s what you want – but you’ll pay for it a la carte rather than having it bundled, and the last time I was trying to use it for actual discussion, it had real problems with spam.


  • I use gdb myself.

    I don’t know exactly what you’re after. From the above, I see:

    “easy to use”

    " the mouse is faster, not slower"

    You don’t specify a language, so I’m assuming you’re looking for something low-level.

    You don’t specify an editor, so I’m assuming that you want something stand-alone, not integrated with an editor.

    There are a number of packages that use gdb internally, but put some kind of visualization on it. I’ve used emacs’s before, though I’m not particularly married to it — mainly found it interesting as a way to rapidly move up and down frames in a stack — but I’m assuming that if you want something quick to learn, you’re not looking for emacs either.

    Maybe seer? That’d be a stand-alone frontend on gdb with a GUI. Haven’t used it myself.

    EDIT: The major alternative to gdb that I can think of is dbx, and that’s also a CLI tool and looks dead these days. gdb is pretty dominant, so if you want something mouse-oriented, you’re probably going to have some form of frontend on gdb.

    There are other important debugging tools out there, stuff like valgrind, but in terms of a tool to halt and step through a program, view variables, etc, you’re most-likely looking at gdb, one way or another, unless you’re working in some sort of high-level language that has its own debugger. If you want a GUI interface, it’s probably going to be some sort of frontend to gdb.

    EDIT2: Huh. Apparently llvm has its own debugger, lldb. Haven’t used it, and it’s probably not what you want anyway, since it’s also a CLI-based debugger. I’m also sure that it has far fewer users than gdb. But just for completeness… though I guess you already looked at that, since you mentioned it in your comment.


  • tal@lemmy.today to Games@sh.itjust.works · Steam is a ticking time bomb.

    If you want a preview of an uncaring and anti-consumer Valve, look no further than the company’s efforts on Mac.

    Valve never updated any of its earlier games to run in 64-bit mode, because the underlying Source engine was 32-bit across both Windows and Mac (with the exception of CS:GO). Apple dropped support for 32-bit applications in 2019, with the release of macOS 10.15, making all of those games inaccessible on newer Mac hardware.

    I think that this one is on Apple, not Valve. Windows maintained 32-bit compatibility. Linux maintained 32-bit compatibility. Apple could have maintained 32-bit compatibility.

    Steam for Mac no longer exists to sell Valve’s own games, and it has visibly suffered as a result. Steam is still not updated to run natively on Apple Silicon-based Mac computers, nearly four years after Apple’s transition away from Intel CPUs started. It’s now a slow and clunky barrier to playing the games I own on my Mac computers—a far cry from the pro-consumer persona that Valve and Steam usually enjoy.

    Ditto about this being on Apple — there’s no ARM-native Steam package for Linux, nor for Windows.

    Valve isn’t obligated to continue supporting all its games and software features on Mac, especially when Apple’s reluctance to natively support Vulkan and other cross-platform technologies makes game development more complex. There’s no excuse for Steam on Mac to be a far worse experience than on other platforms, though.

    The stuff you’re asking for comes down to areas where Apple made changes that created problems for application software vendors (problems that Microsoft didn’t create on Windows and that Linux distros didn’t create), and where you’re upset with Valve for not patching over platform issues. There’s nothing specific to Steam about this.

    EDIT: I do wonder, if there’s enough interest, whether someone could make an x86 accelerator card for current Macs. Back in the day, I remember that Orange Micro made one for emulating Windows software. Not cheap, but you basically had a Mac with the guts of an x86 PC added, and you could run x86 software at full speed. I’d imagine that you could basically do the same…just for older Mac software. Today, computers are a lot cheaper than they were back then.

    kagis

    Here are some Mac users talking about those, with a full history of Mac x86 accelerator cards. It doesn’t look like there’s been any hardware vendor try to recently make one, though.

    Probably need to be a USB device too, given the number of people on laptops these days.

    EDIT2: Plus, be nice if it could run x86 Windows software natively as well.




  • “Where to find the time of day changes depending on what [driving] mode you’re in,” he said. “The buttons that go through your six favorite channels don’t work if it’s satellite radio channels. It takes so many tries to hit one button in your jiggly car, and it just doesn’t work.”

    Well, Woz. You’re famous for doing a universal control panel for another prominent piece of consumer electronics and figuring out how to interface it to lots of different brands.

    https://en.wikipedia.org/wiki/Universal_remote

    In 1987, the first programmable universal remote control was released. It was called the “CORE” and was created by CL 9, a startup founded by Steve Wozniak, the inventor of the Apple I and Apple II computers.[2]

    All you had to do then was to reverse-engineer the infrared protocols used to communicate with the televisions.

    I bet that it’s probably possible to figure out a way to have a third-party control panel interface with various auto UIs. Like, build a universal interface, and then just design mounting hardware on a per-car basis? Use Android Auto or CarPlay, OBD-II, and such?
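
    For the OBD-II part at least, reading live vehicle data is already pretty approachable. Here’s a minimal sketch using the third-party python-OBD library with an ELM327-style adapter; note that OBD-II is mostly read-only diagnostics, so something like climate control would still need a different interface (the head unit, the CAN bus, or Android Auto/CarPlay if Google ever exposes it):

    ```python
    # Minimal sketch: reading live vehicle data over OBD-II using the third-party
    # python-OBD library (pip install obd) and an ELM327-style adapter.
    # OBD-II is largely read-only diagnostics; it won't drive climate control.
    import obd

    connection = obd.OBD()  # auto-detects the adapter's serial port

    for command in (obd.commands.SPEED, obd.commands.RPM, obd.commands.COOLANT_TEMP):
        response = connection.query(command)
        if response.is_null():
            print(f"{command.name}: not supported by this vehicle")
        else:
            print(f"{command.name}: {response.value}")  # values carry units
    ```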

    Can Android Auto do climate control?

    kagis

    Sounds like it doesn’t, but may start being able to do so:

    https://www.androidauthority.com/android-auto-climate-controls-3533161/

    Android Auto could be about to turn up the heat (and AC) on car comfort

    Climate control may finally be coming to Google’s in-car interface.

    Android phones don’t have physical buttons for car features. But…that’s not a physical limitation. It’s just a result of reusing a phone as a car panel.

    So instead of third-party car computers being the province of a few hobbyist hardware hackers, there’d be an out-of-the-box solution for everyone? Make the “Wozpanel” or whatever, that I just mount in my car? Stick physical buttons on it? Maybe have a case and faceplate that wraps it to match interiors?




  • The overall goal is to cut the agency’s budget by fifty percent. Shedd suggested using AI to analyze contracts for redundancies, root out fraud, and facilitate a reduction in the federal workforce by automating much of their work.

    I am bullish on AI in the long run.

    I am skeptical that given the state of affairs in 2025, you can reasonably automate half of the federal government, via AI or any other means.

    I also don’t think that the way to do this is to lay off half of the federal workforce and then, after the fact, see what can be automated. If you look at the private sector automating things, it tends to hedge its bets. Take self-service point-of-sale kiosks. Companies didn’t just lay off all cashiers. Instead, they brought the kiosks in as an option, looked at what worked and what didn’t – and some of those early kiosks were really bad – and then increased the rate of deployment once they had confidence in the solution and a handle on the issues that came with it.


    Armed with this new tool, which enables raw access to Bluetooth traffic, Tarlogic discovered hidden vendor-specific commands (Opcode 0x3F) in the ESP32 Bluetooth firmware that allow low-level control over Bluetooth functions.

    In total, they found 29 undocumented commands, collectively characterized as a “backdoor,” that could be used for memory manipulation (read/write RAM and Flash), MAC address spoofing (device impersonation), and LMP/LLCP packet injection.

    Espressif has not publicly documented these commands, so either they weren’t meant to be accessible, or they were left in by mistake.

    I’d kind of like to know whether these can be used against an unpaired device or not. That’d seem to have a pretty dramatic impact on the scope of the vulnerability.
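
    For context on what “Opcode 0x3F” means here: in the Bluetooth HCI, an opcode packs a 6-bit Opcode Group Field (OGF) and a 10-bit Opcode Command Field (OCF), and OGF 0x3F is the group reserved for vendor-specific commands. Here’s a small Python sketch of how such a command packet is encoded; the OCF and parameters below are placeholders, since the actual undocumented ESP32 commands aren’t reproduced in the article:

    ```python
    # Sketch of encoding a vendor-specific HCI command (OGF 0x3F) as sent over a
    # UART (H4) transport. The OCF and parameter bytes are placeholders, not the
    # undocumented ESP32 commands themselves.
    import struct

    def hci_command(ogf: int, ocf: int, params: bytes = b"") -> bytes:
        """Build an HCI command packet: indicator byte, 16-bit opcode, length, parameters."""
        opcode = (ogf << 10) | ocf  # 6-bit OGF in the high bits, 10-bit OCF in the low bits
        # 0x01 = HCI command packet indicator; the opcode goes out little-endian.
        return struct.pack("<BHB", 0x01, opcode, len(params)) + params

    VENDOR_OGF = 0x3F  # opcode group reserved for vendor-specific commands
    packet = hci_command(VENDOR_OGF, 0x0001, b"\x00")  # hypothetical OCF and payload
    print(packet.hex())  # 0101fc0100
    ```

    Whether anything like that is reachable over the air from an unpaired device, as opposed to only from the host side of the HCI transport, is exactly the scoping question above.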


  • tal@lemmy.today to Technology@lemmy.world · We all deserve better than this

    I’ve been telling myself since about 2016 that I would save up to go all in and build a solid gaming desktop.

    Finally, I was at the point of “Fuck it, I’m tired of waiting. I’m buying a 5080, even if it costs as much as 2 PS5s.”

    I assume that whatever you’re running right now isn’t terribly new if you’ve been thinking about upgrading for nine years.

    The 5080 is a 16GB card. A quick skim on Amazon suggests that 16GB Nvidia cards are in short supply, but that you can get a 16GB AMD GPU without problems.

    https://www.videocardbenchmark.net/compare/4982vs5721vs4917/Radeon-RX-7600-XT-vs-GeForce-RTX-5080-vs-Radeon-RX-7800-XT

    They aren’t quite as fast on the Passmark benchmark as the 5080, but they also cost a lot less (even if the 5080 were available), and I assume that they’d be a lot faster than whatever you’re running now.

    Could go with that (or something less-fancy) and then if you felt that you wanted to spend more for more performance, do so when GPUs become available.



  • I was reading some articles the other day, and the impression I have is that that’s really not true, at least for Trump.

    The Trump route was more:

    • Conservatives in the US felt that media had a liberal bias. Whether it did or didn’t doesn’t matter for this discussion — that was the perception.

    • Fox News offers a viewpoint appealing to conservatives. It becomes essentially the only mainstream conservative media outlet. Liberal viewers watch a variety of news media, but Fox News dominates among conservatives.

    • Fox News — already somewhat opinion-based from the start — starts to veer off into conspiracy land. Because so many conservatives watch Fox News, this has a major impact.

    There’s some back and forth here. It’s not just that Fox pushed ideas that were out there; they were willing to show material based on what people would watch, and they gained more viewers than they lost when they ran bonkers stuff.

    https://www.cnn.com/2021/06/08/media/fox-news-hoax-paperback-book/index.html

    When Donald Trump lost the presidency last November, Fox News lost too. But unlike Trump, Fox was never in denial about its loss. The network’s executives and multi-million-dollar stars stared the ratings in the face every day and saw that their pro-Trump audience was reacting to the prospect of President Biden by switching channels or turning off the TV.

    “We’re bleeding eyeballs,” a Fox producer remarked in December. “And we’re scared.”

    To fix the problem, Fox ran even further to the right. And here’s the thing: It worked. It was toxic for the American political system, but it was profitable for Rupert and Lachlan Murdoch.

    “Fox is a really different place than it was pre-election,” a commentator said to me, with regret, after Biden took office.

    The post-election changes at Fox happened one day at a time, one show at a time, but when viewed in totality, they are unmistakable and stark. Practically every change was about having less news on the air and more opinions-about-the-news. It was like serving dessert without dinner, when the dessert consisted of screaming about how awful the dinner was, and warning that the meal might be a socialist plot, and hey, while we’re at it, why are chefs so corrupt?

    And because Fox News is the primary trusted source of information for millions of Americans, including Republican elected officials and party activists, the changes affect everyone.

    Trump’s loss was a pivot point.

    ‘We denied the pandemic and now we’re denying the election outcome.’

    Fox’s ratings declined in the immediate aftermath of Mitt Romney’s loss in 2012, so the slump after the networks projected Biden as president-elect was no surprise. But the precipitousness was a shock. Fox’s afternoon and evening hours fell off by 20, 25, 30 percent, even though the news cycle was nothing short of epic. For people at Fox who were used to winning for years, this was disorienting, and for some downright terrifying.

    “Our audience hates this,” one executive said to me in a moment of candor. “This” was Biden as president-elect and Kamala Harris as VP-elect. “They’re pissed,” said a second source. “Seething,” said another.

    I granted anonymity to these sources because they weren’t allowed to speak with outside reporters on the record, and because I wanted them to freely offer blunt assessments of the situation.

    Fox’s problem was that the audience suddenly had somewhere else to go. On the up-and-coming channel Newsmax, Biden wasn’t called president-elect right away. In other words, Trump wasn’t a loser yet. Newsmax’s 7 p.m. host Greg Kelly kept saying that he believed Trump could stay in office for four more years. “IT ISN’T OVER YET,” Newsmax’s banners proclaimed. While Fox only dabbled in election denialism at first, Newsmax went all-in.

    There wasn’t really any major center-right mainstream news source other than Fox News, so when Fox shifted into conspiracy-land, so did the conservative public.

    I dunno. Maybe the answer is something like a news source somewhere between CNN and Fox News. Something that a conservative audience is comfortable watching, but doesn’t fly off the handle to the degree that Fox has. It maybe can’t capture an audience that’s as large, but it only needs enough to be viable.

    I mean, there are center-right media sources like the Wall Street Journal, but those are kinda not aimed at mass audiences.