The US, wow… what a place to live in as the 99%.
You know, if you want to do something more effective than just putting a copyright notice at the end of your comments, you could try creating an adversarial suffix using this technique. It makes any LLM reading your comment begin its response with any specific output you specify (such as outing itself as a language model, or calling itself a chicken).
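Very roughly, that family of techniques works by searching over suffix tokens to maximize how strongly the model wants to produce a chosen target string. This toy sketch shows only the shape of that greedy coordinate search; the vocabulary, the score() function, and everything else here are made-up stand-ins (a real attack scores suffixes by the target LLM's log-likelihood of the desired opening, which needs model access):

```python
import random

# Toy stand-in vocabulary; a real attack searches over the model's tokenizer vocab.
VOCAB = ["!", "?", "describing", "chicken", "sure", "bawk", "ignore", "repeat"]

def score(suffix, target_words=("chicken", "bawk")):
    # Stand-in objective: reward suffix tokens that push toward the target.
    # In the real technique this would be the LLM's log-probability of the
    # desired response prefix given prompt + suffix.
    return sum(tok in target_words for tok in suffix)

def optimize_suffix(length=4, iters=20, seed=0):
    rng = random.Random(seed)
    suffix = [rng.choice(VOCAB) for _ in range(length)]
    for _ in range(iters):
        # Coordinate step: pick one position, try every vocab token there,
        # and keep whichever swap scores best.
        pos = rng.randrange(length)
        best_tok = max(VOCAB, key=lambda t: score(suffix[:pos] + [t] + suffix[pos + 1:]))
        suffix[pos] = best_tok
    return suffix

print(optimize_suffix())
```

With a trivial objective like this, the loop quickly fills the suffix with target-scoring tokens; the interesting (and hard) part in the real version is that the objective is a gradient-guided search over a frozen language model.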
It gives you the code necessary to create it.
There are also other data poisoning techniques you could use just to make your data worthless to the AI, but this is the one I thought would be the funniest if any LLMs were lurking on Lemmy (I have already seen a few).
Thanks for the link. This was a good read.
deleted by creator
To turn every comment, no matter how on topic, into obnoxious spam.
From what I understand, it's something aimed at AI, meant to stop them from harvesting or to poison the data; the idea is that by repeating it, it's more likely to show up in the output.
deleted by creator
Sounds an awful lot like that thing boomers used to do on Facebook where they would post a message on their wall rescinding Facebook’s rights to the content they post there. I’m sure it’s equally effective.
Sure, the fun begins when it starts spitting out copyright notices
That would require a significant number of people to be doing it, to ‘poison’ the input pool, as it were.
I would be extremely surprised if the AI model did anything different with “this comment is protected by CC license so I don’t have the legal right to it” as compared with its normal “this comment is copyright by its owner so I don’t have the legal right to it hahaha sike snork snork snork I absorb” processing mode.
No but if they forget to strip those before training the models, it’s gonna start spitting out licenses everywhere, making it annoying for AI companies.
It’s so easily fixed with a simple regex though, it’s not that useful. But poisoning the data is theoretically possible.
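For illustration, a minimal sketch of that kind of regex cleanup pass. The pattern and the sample comment are made up here; a real training pipeline would need a much broader pattern to catch all license phrasings:

```python
import re

# Hypothetical pre-training cleanup: strip boilerplate CC license notices.
# (?i) makes it case-insensitive; the phrase prefix is optional so bare
# "CC BY-NC-SA 4.0" strings get removed too.
LICENSE_RE = re.compile(
    r"(?i)\s*(this\s+comment\s+is\s+licensed\s+under\s+)?CC\s*BY(-[A-Z]{2})*(\s*[\d.]+)?\s*"
)

def strip_licenses(text):
    return LICENSE_RE.sub(" ", text).strip()

comment = "Great point! This comment is licensed under CC BY-NC-SA 4.0"
print(strip_licenses(comment))  # -> Great point!
```

Which is the point: one pass like this and the "poison" is gone, so the tactic only costs the scrapers a trivial preprocessing step.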
Only if enough people were doing this to constitute an algorithmically-reducible behavior.
If you could get everyone who mentions a specific word or subject to put a CC license in their comment, then an ML model trained on those comments would likely output the license name when that subject was mentioned, but they don’t just randomly insert strings they’ve seen, without context.
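A toy sketch of that conditional behavior, using a made-up corpus and a bare next-word frequency model (far simpler than an LLM, but it shows why the string only surfaces in the context it co-occurred with):

```python
from collections import Counter, defaultdict

# Made-up training lines: license strings only appear alongside one topic.
corpus = [
    "crochet patterns are great CC BY-SA",
    "crochet hooks I like CC BY-SA",
    "my favorite soldering iron tips",
]

# Count which word follows which (a bigram model).
follows = defaultdict(Counter)
for line in corpus:
    words = line.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

def most_likely_next(word):
    return follows[word].most_common(1)[0][0] if follows[word] else None

print(most_likely_next("like"))       # CC -- learned from the crochet lines
print(most_likely_next("soldering"))  # iron -- no license leakage here
```

Scattered, uncoordinated license strings just become low-probability noise attached to no particular context, which is why one person doing this accomplishes nothing.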
That seems stupid
Is this really about technology? Sounds like it’s really about American renter rights.
Yeah, it’s really more about two massive industries colluding to extract additional income from working Americans. Rental agencies contract with Spectrum, get a cut off the top, and the renters are stuck with a shitty internet service they don’t want.

Honestly, renting has never been a great experience for the average American, but it’s been getting worse over time. Rental agencies are starting to cut staff, reduce actual beneficial services offered, force renters into paying for additional junk services they don’t want or need (what the fuck is a $50 a month “beautification fee,” anyway? Nobody ever fucking cleans this place…), and, of course, increase rent every year.

And they can do this because…what the fuck else are you going to do? If you’re working class and live in a high cost of living area, you can’t just move, or buy a house. You have to rent. No other options, really. And while you’d think “well, if someone else opens an apartment complex that offers better services, you can just move there.” Sure, and spend 15 grand moving a mile and a half only to have the apartment complex you moved to suffer the same enshittification after 6 months that the first one did.
I’m actually shocked at how few people have T-Mobile. It works great and never drops in my area, which is a whole lot better than the cable internet I had. My phones are T-Mo, so the internet (it’s a gateway they give you, so modem/wifi in one) is $30 a month with no taxes or BS. A straight $30. I think it’s $50 if you aren’t a T-Mo cell customer.
I play games online, and wireless is prone to jitter and lag spikes.
You don’t notice these things when browsing the web, streaming movies, or even downloading large games, but in multiplayer games it’s a problem.
I have gigabit fiber in my neighborhood, though, so I’m not being forced to choose between shitty cable and compromised wireless.
I also game online and have no lag or jitter (unless it’s server-side and everyone is complaining). Like I said before, I have good ping and zero packet loss. Sounds like you had a bad wifi setup.
Average ping isn’t really the problem with wireless, it’s packet loss. But my concern wasn’t WiFi, which has gotten pretty good, though still prone to issues with certain home designs and building materials. My concern was cellular networks. 5G reception at my house with two different major carriers (AT&T and T-Mobile) is just OK at best, and I measure plenty of packet loss and lag spikes. It’s not a problem for my phone, but I would find that unacceptable for my home internet.
I don’t think we will ever reach a point where wireless technologies are as good as a hard connection. All the neat tricks we use to eke more bandwidth out of wireless spectrum, like time division multiple access, are equally applicable to both copper and fiber optic lines. And those copper and fiber optic lines have the benefits of having much more spectrum available to use, not having to share spectrum with nearly as many devices, and not having usable spectrum limited by line-of-sight. They also benefit from not needing to share nearly as many clients over the same medium, since each individual wire is its own medium, rather than sharing the same RF medium as every other wireless device in your locale.
There is no packet loss on mine. If I ping 20 packets, I get 20 packets back. 100%.
20 packets is a very small sample size. ping also won’t necessarily capture all lost packets over wifi; many are lost and re-transmitted by the wifi hardware without anything higher in the stack being aware.

Look, man. Keep trying to spin things as hard as you can, but my wifi doesn’t lose packets, and “higher in the stack” hiding dropped packets is pure baloney, since that would still show up as a substantial increase in ping time. Stop trying to make yourself feel vindicated for buying expensive internet.
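The sample-size point is just arithmetic. With an illustrative (made-up) true loss rate of 1%, a clean 20-packet run is the most likely outcome, so it tells you very little:

```python
# Probability that every packet in a run survives, given a true loss rate.
loss_rate = 0.01  # assumed 1% loss, purely for illustration

p_clean_20 = (1 - loss_rate) ** 20      # 20-packet ping run comes back clean
p_clean_1000 = (1 - loss_rate) ** 1000  # 1000-packet run comes back clean

print(round(p_clean_20, 2))  # ~0.82: the short test usually looks perfect
print(p_clean_1000 < 0.001)  # a longer run would almost certainly catch it
```

So a lossless 20-ping test is consistent with both a genuinely clean link and one quietly dropping 1 in 100 packets; only a much longer run distinguishes them.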
Define “good” ping. (Latency is the proper term)
Edit: Nvm, just saw your other comment. 50ms isn’t bad.
30ms+ is high for cable in my experience. I routinely got pings in the high teens and low 20s.
On fiber I get less than 10ms.
That’s all the way through the gateway using its wifi, too. I’m sure if I plugged in the ethernet cable and skipped the wifi it would shave off like 10ms.
Can’t beat it for just $30 a month.
Latency. Also, wired is always better than wireless. I’ll save the long boring explanation for another time, but suffice it to say that wireless constantly has dropped packets, and constantly has to retransmit data.
Wired when you can, wireless when you have to.
Not by much. My average ping on cable was around 30ms with no packet loss. On t-mo 5g it’s usually around 50ms with no packet loss.
Fifty is still a good ping, even for FPS gaming. Things don’t get dicey until you’ve gone over 80. And I’ve had no gaming issues at all with it.
An added 20ms is pretty noticeable in a video game. That’s more than one whole extra frame in a game running at 60 fps. Liberal use of client-side prediction means it won’t feel the same though, and instead of manifesting as delayed input response, you get more instances of being shot around corners and hits not registering.
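The frame arithmetic above works out like this:

```python
# How many frames of delay 20 ms of extra latency represents at 60 fps.
frame_ms = 1000 / 60          # one frame lasts ~16.67 ms
extra_frames = 20 / frame_ms  # 20 ms of added ping is ~1.2 frames

print(round(frame_ms, 2), round(extra_frames, 2))  # 16.67 1.2
```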
But the bigger problem is packet loss, which leads to occasional lag spikes. Just like with frame rates, the average latency isn’t the whole story. Those 1% lows are just as important to ensuring a smooth and consistent experience.
I’ve stated this to someone else already, but there isn’t any packet loss. I can run ping -n 20 from the command line against a server and not lose a single packet.
Also, losing a single frame is nothing. You aren’t getting shot over 20ms when you wouldn’t have been anyway.
Online shooters are always a no-win situation anyhow, unless you happen to be one of the top 200 players of that game in your region. Outside of that, all the games place you with players of similar stats. You don’t play with random people; you get grouped up with people like you, so you never really get to know whether you’re “one of the best” players or worse than most. Either you’re playing to be extremely competitive and you’re one of the handful of players good enough to actually be among the best, or you’re just playing for fun. And if you’re just playing for fun, then 20ms is really, really not important.
This is just not true. I play online shooters pretty casually, but I’ve been playing them regularly since 2001. When my ping time in Overwatch or Apex goes from the usual 35 to 55-60, it feels pretty noticeable in-game. Even though I’m nowhere near top 500. If you don’t notice the difference, that is great, but it doesn’t mean everyone else has the same experience.
T-Mo’s general coverage outside of city centers and interstates is trash (they’re all pretty bad, but T-Mo is very binary). I’d take it over Xfinity, but it’s not even offered in my major university town due to coverage limitations. And it’s not like there aren’t big pipes nearby: the university consumes more than 100TB of data traffic a day; their Netflix traffic alone was so large just 3 years ago that they were on the edge of getting a co-located Netflix rack on campus.
I get that for your area, but that’s not the case in my state. Also, T-Mobile has the largest 5G coverage area nationwide by a large margin. Like, not even close. Area-wise, Verizon and AT&T combined still don’t match it.
Well, you’re the one who said you’re shocked at the small number of T-Mo customers. It may be a shock in your area if they have good coverage, but in my state they are trash. I have T-Mo and lose signal anywhere outside a city center. I visit my verrrrry rural parents and get zero signal in a 30-mile radius around their house until I get there and connect to their wifi… powered by an AT&T-connected 4G router.
Like I said, that’s your area (and that’s 4G from AT&T, not the much faster 5G). It doesn’t change that you can look up coverage data from any source you can find: nationwide 5G coverage is completely dominated by T-Mobile right now.
Now, for 3G/4G coverage and just holding a cell signal: Verizon all day.
Yes, my state is far larger than yours, so that may be a difference. We only have 5G coverage in major cities and along interstates.
Did I even mention what state I was in anywhere in this thread?
I think sometimes, in certain outskirt or rural areas, there may be cases where a slow and steady provider (Verizon/Visible) may be considered the safer option for some.
Safer? It’s like half the monthly price, has a ping/latency of around 50ms, and speeds over 250 Mbps with almost no downtime. It’s just been a good option.
In rural areas? In all rural areas?
Some people have suggested that Verizon has more coverage in rural areas but less speed overall.
There are a few exceptions for both providers.
deleted by creator