garote: (wasteland librarian)
[personal profile] garote posting in [community profile] talkpolitics
Late last year I wrote this. Since it's on-topic, I'd like to see what everyone here thinks...

Search engines used to take in a question and then direct the user to some external data source most relevant to the answer.

Generative AI in speech, text, and images is a way of ingesting large amounts of information specific to a domain and then regurgitating synthesized answers to questions posed about that information.  This is basically the next evolutionary step of the search engine.  The main difference is that the answer is an in-house synthesis of the external data, rather than a simple redirect to it.

This is being implemented right now on the Google search page, for example.  Calling it a search page is now inaccurate.  Google vacuums up information from millions of websites, then regurgitates an answer to your query directly.  You never perform a search.  You never visit any of the websites the information was derived from.  You are never aware of them, except in the case where Google is paid to advertise one to you.

If all those other pages didn’t exist, Google's generative AI answer would be useless trash.  But those pages exist, and Google has absorbed them.  In return, Google gives them ... absolutely nothing, but still manages to stand between you and them, redirecting you to somewhere else, or ideally, keeping you on Google permanently.  It's convenient for you, profitable for Google, and slow starvation for every provider of content or information on the internet.  Since its beginning as a search engine, Google has gone from middleman, to broker, to consultant.  Instead of skimming some profit in a transaction between you and someone else, Google now does the entire transaction, and pockets the whole amount.

Reproducing another's work without permission is already illegal, and has been for a long time.  The only way this new process stays legal is if the ingested work is sufficiently large or diluted that the regurgitated output looks different enough (to a human) that it does not resemble a mere copy, but an interpretation or reconstruction.  There is a threshold below which any reasonable author or editor would declare plagiarism, and human editors and authors have collectively learned that threshold over centuries.  Pass that threshold, and your generative output is no longer plagiarism.  It's legally untouchable.

An entity could ingest every performance given by Mavis Staples, then churn out a thousand albums "in the style" of Mavis Staples, and would owe Mavis Staples nothing, while at the same time reducing the value of her discography to almost nothing.  An entity could do the same for television shows, for novels, even non-fiction books, even academic papers and scientific research, and owe the creators of these works nothing, even if it leveraged infinite regurgitated variations of the source material for its own purposes internally.  Ingestion and regurgitation by generative AI is, at its core, doing for information what the mafia does with money to hide it from the law: it is information laundering.

Imitation is the sincerest form of flattery, and there are often ways to leverage imitators of one's work to gain recognition or value for oneself. These all rely on the original author being able to participate in the same marketplace that the imitators are helping to grow. But what if the original author is shut out? What if the imitators have an incentive to pretend that the original author doesn't exist?

Obscuring the original source of any potential output is the essential new trait that generative AI brings to the table.  Wait, that needs better emphasis:  The WHOLE POINT of generative AI, as far as for-profit industry is concerned, is that it obscures original sources while still leveraging their content.  It is, at long last, a legal shortcut through the ethical problems of copyright infringement, licensing, plagiarism, and piracy -- for those already powerful enough to wield it.  It is the Holy Grail for media giants.  Any entity that can buy enough computing power can now engage in an entirely legal version of exactly what private citizens, authors, musicians, professors, lawyers, etc. are discouraged or even prohibited from doing ... a prohibition that all those individuals collectively rely on to make a living from their work.

The motivation to obscure is subtle, but real.  Any time an entity provides a clear reference to an individual external source, it is exposing itself to the need to reach some kind of legal or commercial or at the very least ethical negotiation with that source.  That's never in their financial interest.  Whether it's entertainment media, engineering plans, historical records, observational data, or even just a billion chat room conversations, there are licensing and privacy strings attached. But, launder all of that through a generative training set, and suddenly it's ... "Source material? What source material? There's no source material detectable in all these numbers. We dare you to prove otherwise." Perhaps you could hire a forensic investigator and a lawyer and subpoena their access logs, if they were dumb enough to keep any.

An obvious consequence is that, to stay powerful or become more powerful in the information space, these entities must deliberately work towards the appearance of "originality" while at the same time absorbing external data, which means increasing the obscurity of their source material.  In other words, they must endorse and expand a realm of information where the provenance of any one fact, any measured number, any chain of reasoning that leads outside their doors, cannot be established.  The only exceptions allowable are those that do not threaten their profit stream, e.g. references to publicly available data.  For everything else, it's better if they are the authority, and if you see them as such.  If you want to push beyond the veil and examine their reasoning or references, you will get lost in a generative hall of mirrors.  Ask an AI to explain how it reached some conclusion, and it will construct a plausible-looking response to your request, fresh from its data stores.  The result isn't what you wanted.  It's more akin to asking a child to explain why she didn't do her homework, and getting back an outrageous story constructed in the moment.  That may seem unfair, since generative AI does not actually try to deceive unless it's been trained to.  But the point is ... if it doesn't know, how could you?

This economic model has already proven to be ridiculously profitable for companies like OpenAI, Google, and Adobe.  They devour information at near zero cost, create a massive bowl of generative AI stew, and rent you a spoon.  Where would your search for knowledge have taken you, if not to them?  Where would the money in your subscription fee have gone, if not to them?  It's in the interest of those companies that you be prevented from knowing.  Your dependency on them grows.  The health of the information marketplace and the cultural landscape declines.  Welcome to the information mafia.

Postscript:

Is there any way to avert this future? Should we?

We thoroughly regulate the form of machines that transport humans, in order to save lives.  We regulate the content of public school curricula according to well-established laws, for example those covering the Establishment Clause of the First Amendment.  So regulating devices and regulating information content is something we're used to doing.

But now there is a machine that can ingest a copyrighted work, and spit out a derivation of that work that leverages the content, while also completely concealing the act of ingesting. How do you enforce a law against something that you can never prove happened?
luzribeiro: (Default)
[personal profile] luzribeiro posting in [community profile] talkpolitics
I'm all for smart guardrails that help us harness AI safely without suffocating innovation. Now, the US has been highly reactive (with over 550 AI‑related bills in 45 states) but lacks cohesive federal direction. Meanwhile, the EU’s sweeping “AI Act” sets high standards but could overburden smaller innovators:
https://www.wired.com/story/plaintext-sam-altman-ai-regulation-trump/
https://www.mdpi.com/2078-2489/14/12/645
https://time.com/7213096/uk-public-ai-law-poll/

So, how about:

Targeted regulation: Instead of painting AI with one brush, focus on where the risks lie, like bias in hiring tools or misuse in facial recognition.

Outcome over technology: Don’t regulate the tech itself; regulate its applications.

Enforceable rules: We need real teeth - clear accountability, not toothless charters.

Bottom line: What we need is fine‑tuned, enforceable, risk‑adaptive policies, so AI can thrive while protecting people.

Thoughts?

nonstop non-story nonsense

Jul. 11th, 2025 02:52 pm
oportet: (Default)
[personal profile] oportet posting in [community profile] talkpolitics
So....

The 'list' of Epstein 'clients' that existed before - never existed.

The security camera footage outside his cell that didn't exist before - does exist (and rumor has it that it's been altered).

Now there are also rumors of a little they-go-or-I'm-gone between a few higher-ups in the Trump administration (Bondi vs. Bongino).

Someone resigning or getting fired seems inevitable at this point - who do you think it will be?

Who do you believe Epstein was? A disgraced financier with a sick side job? CIA? Mossad? All of the above?

If you were an advisor to Trump, would you advise him to say nothing, do nothing, and wait for it to go away?

Or is this not going away?

Friday off-topic: Origami in space!

Jul. 11th, 2025 09:35 am
fridi: (Default)
[personal profile] fridi posting in [community profile] talkpolitics
Sometimes, people ponder profound questions like: "What's the meaning of life?", "Are we alone in the Universe?", or "What happens if you throw a paper airplane from the International Space Station?"

Luckily, we now have an answer to the third one, because eventually, someone was bound to try it:

Could a Paper Plane Thrown From The International Space Station Survive The Flight?

Oh look, there's even an illustration! And it looks very science-y:

[illustration]

And a Youtube video of course:

[embedded video]

Consider yourselves educated. You're welcome!

The global Fentanyl crisis

Jul. 7th, 2025 10:43 pm
airiefairie: (Default)
[personal profile] airiefairie posting in [community profile] talkpolitics
Over the last decade, fentanyl and its analogs, extremely potent synthetic opioids, have overtaken prescription painkillers and heroin to become the leading cause of overdose deaths, especially in North America. In the US, synthetic opioids now account for the majority of drug-related fatalities among those ages 18–45, with over 100,000 overdose deaths in 2021 and 2022 alone.
https://www.brookings.edu/articles/the-north-american-fentanyl-crisis-and-the-spread-of-synthetic-opioids/

These lethal drugs don't just appear out of nowhere. Trafficking networks orchestrate a three-part supply chain: precursor chemicals (largely sourced from China) fuel large-scale laboratory production, often in Mexico and Canada, before distribution into the US and beyond. A major Reuters investigation found that China's lax chemical regulations have enabled these precursors to be shipped widely, sustaining the epidemic.
https://www.reuters.com/investigates/special-report/drugs-fentanyl-supplychain/

kiaa: (3d)
[personal profile] kiaa posting in [community profile] talkpolitics
Mankind is wrong since 1905 ― Experts detect the real dimensions of time

This guy has an interesting theory, and claims that this time it can be tested. In a nutshell, he argues that time is three-dimensional and space is just a consequence of that, a byproduct. Which in essence turns General Relativity on its head. Literally.

The three dimensions of time he posits:

1. The first is the familiar linear time we already know, running directly from past to future.
2. The second would allow us to access alternative versions of the same moment. It would be as if we could revisit an ordinary day and discover what it would have been like if we had made different choices.
3. The third would allow the transition between these different possibilities.