
AI/Automation Megathread

Morphic Tide

Well-known member
and the sort of rote work jobs that humans arguably should not be doing
The reliability problems they have make this extremely suspect; even self-driving cars, with the spectacular amount of work behind them, shit themselves with absurdities like "text of a thing equals the thing" (reading a picture of a sign as if it were the sign itself). Hence them using the AI only to parse the surroundings and feed that to conventional automation, with a variety of hardware backups.
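What that layering might look like, as a purely hypothetical sketch (every name and threshold below is invented): the model parses the scene, deterministic rules make the decision, and an independent hardware interlock holds the final veto.

```python
# Purely hypothetical layering, all names and thresholds invented: the ML
# model only parses the scene, deterministic rules act on the parse, and
# an independent hardware interlock can override the software outright.
from dataclasses import dataclass

@dataclass
class SceneReport:
    obstacle_distance_m: float
    confidence: float

def ml_perception(camera_frame) -> SceneReport:
    # Stand-in for the neural network: parse the surroundings, nothing more.
    return SceneReport(obstacle_distance_m=42.0, confidence=0.97)

def conventional_controller(report: SceneReport) -> str:
    # Auditable, deterministic rules make the actual decision.
    if report.confidence < 0.9:
        return "slow_and_alert"      # never act on a shaky parse
    if report.obstacle_distance_m < 10.0:
        return "brake"
    return "proceed"

def hardware_interlock(command: str, lidar_distance_m: float) -> str:
    # Independent sensor backup gets the last word, regardless of the model.
    return "brake" if lidar_distance_m < 5.0 else command

command = conventional_controller(ml_perception(camera_frame=None))
print(hardware_interlock(command, lidar_distance_m=42.0))  # -> proceed
```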
 

Scottty

Well-known member
Founder
The reliability problems they have make this extremely suspect; even self-driving cars, with the spectacular amount of work behind them, shit themselves with absurdities like "text of a thing equals the thing" (reading a picture of a sign as if it were the sign itself). Hence them using the AI only to parse the surroundings and feed that to conventional automation, with a variety of hardware backups.

I wasn't talking about self-driving cars. Unless they've got one that can follow the hand-gestures of a traffic cop to divert across a field onto another road, because the one it's currently on has been blocked by a pile of burning tires and an angry mob, and do this under conditions of poor vision due to all the smoke, I'm really not interested.

No, I'm talking about factory stuff where things roll along on a conveyor belt and the bottle gets filled at one stop, then the lid put on at the next, then the wrapping with the logo at the next...
That's not work fit for human beings.
 

Agent23

Not a step back!
ChatGPT will keep 'hallucinating' wrong answers for years to come and won't take off until it's on your cellphone, Morgan Stanley says

And here is the good old hype cycle for anyone who wants to see it again:

hype-cycle.png


We are in the max hype stage for a lot of these projects and companies because they are trying to justify their huge valuations and existence.

Everything will crash, the zomg to the moon baby story bubble will burst, and we might get 1-2 fun little curios or actually productive companies.
 

Cherico

Well-known member
ChatGPT will keep 'hallucinating' wrong answers for years to come and won't take off until it's on your cellphone, Morgan Stanley says

And here is the good old hype cycle for anyone who wants to see it again:

hype-cycle.png


We are in the max hype stage for a lot of these projects and companies because they are trying to justify their huge valuations and existence.

Everything will crash, the zomg to the moon baby story bubble will burst, and we might get 1-2 fun little curios or actually productive companies.

To be fair, tiny incremental gains are how we got this far.
 

Marnuplee

Well-known member
That reminds me, any tech-literate conservatives or libertarians you know of who’ve tried to make their own AI chatbots in response?

Would expect those to have biases of their own, for sure, but you can hardly have a platform more slanted than ChatGPT that is still embraced by the mainstream institutions and customer base.
Andrew Torba, CEO of Gab.


 
disaster-popcorn.gif
 

Agent23

Not a step back!
Andrew Torba, CEO of Gab.


I wonder what Torba is drinking...
 

Agent23

Not a step back!
This is pretty amazing stuff; I can't wait to see what fan-made and original content gets created. Small teams of creatives with low budgets could make some amazing things soon.

Did we invent filters that turn an image into a painting?

Oh, yeah, we had that a while back.
 

hyperspacewizard

Well-known member
Did we invent filters that turn an image into a painting?

Oh, yeah, we had that a while back.
It's a little more involved and creative than that. Watch the videos and really think about what smaller teams can pull off with this style of technique. I personally liked the way they used bought 3D assets and got their trained model to transform the backgrounds into a style that made sense with the characters. This could just as easily be used to create sci-fi shows or any other style. And it isn't rotoscoping, either: the AI is drawing complete images from a live-action recording, but the images aren't being traced over, so the final product is animation.
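For anyone curious what the frame-restyling step can look like, here is a minimal img2img sketch assuming the open-source diffusers library; the checkpoint, prompt, and strength value are illustrative guesses, not what the team in the videos actually used:

```python
# Minimal img2img restyling of a single live-action frame. The checkpoint,
# prompt, and strength here are illustrative guesses, not the actual
# pipeline from the videos.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

frame = Image.open("live_action_frame.png").convert("RGB").resize((768, 512))

# strength sets how far the model departs from the source frame: low keeps
# the live-action layout, high repaints more freely (and less consistently).
out = pipe(
    prompt="flat-shaded anime background, cel-shaded, clean line art",
    image=frame,
    strength=0.45,
    guidance_scale=7.5,
).images[0]
out.save("stylized_frame.png")
```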
 

Marnuplee

Well-known member
Gab now has an AI that can generate a movie.

 

Typhonis

Well-known member
Gab now has an AI that can generate a movie.

I give it a week before someone makes a porno.
 

Bear Ribs

Well-known member
Gab now has an AI that can generate a movie.

It's hilarious to read back in this thread and realize that just a month ago we had pundits here nodding sagely, explaining to us that making a movie was thousands, perhaps millions, of times harder than chatting or making images, and well beyond what AI would be able to do in the foreseeable future.
 

Morphic Tide

Well-known member
It's hilarious to read back in this thread and realize that just a month ago we had pundits here nodding sagely, explaining to us that making a movie was thousands, perhaps millions, of times harder than chatting or making images, and well beyond what AI would be able to do in the foreseeable future.
The linked post demonstrates incredible difficulty with stability even in static scenes: the "planets" change almost completely between frames, and only thirty seconds of footage are shown in total. The same problem crops up in Disturbed's Bad Man video and in every other AI animation I've seen; they are all hellishly unstable, to the point of being nearly worthless for producing conventional content.

The exception is the AI Seinfeld, but that goes to such extreme lengths to reduce the complexity that it's barely worth considering for end use, and it still has visible consistency errors, like a "box" with an edge that switches from concave to convex mid-scene. And that's assuming the "clipping" behavior and certain other artifacts aren't the result of using "dolls" to bypass a lot of the difficulty entirely, instead of direct AI generation of footage.
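That instability is easy to measure crudely: compare consecutive frames and watch the deltas. A throwaway sketch assuming OpenCV and a local clip (scene cuts aren't handled; the numbers are just for eyeballing):

```python
# Throwaway probe of frame-to-frame stability: mean absolute pixel change
# between consecutive frames. "clip.mp4" is a placeholder path; scene cuts
# would need to be excluded for a fair comparison.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")
ok, prev = cap.read()
deltas = []
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    deltas.append(float(np.mean(cv2.absdiff(frame, prev))))
    prev = frame
cap.release()

# Conventional footage stays low between cuts; the AI clips described
# above spike on nearly every frame.
print(f"mean delta: {np.mean(deltas):.1f}  max delta: {np.max(deltas):.1f}")
```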

I lived through the 80s. I learned the hard way not to say "It's beyond possibility that computers can...."
The difference is that we're running out of curve on silicon, and the current methodology has little potential for optimizing the problem at the scale being worried about, so continued progress is the domain of wholly new technology. Maybe quantum computing will finally leave the "coming soon!" stage it's been stuck in for over a decade sometime in the next five years; maybe dedicated-architecture magic will bridge the gap with existing manufacturing, the way it warped cryptocurrency; maybe a new AI methodology will bypass the technical challenges.

But because we physically can't brute-force it by throwing more transistors at the problem, it is uncertain. This is "fusion is 10 years away!" thinking, the same line we've heard for the last 50 years: the assumption that because visible progress is being made, the thing is obviously close to being sold to the end user. Maybe it's 5 years, maybe it's 10, maybe the technical challenges will keep cropping up for the next 50 years, as they have for fusion.
 

Doomsought

Well-known member
The difference is that we're running out of curve on silicon, and the current methodology has little potential for optimizing the problem at the scale being worried about, so continued progress is the domain of wholly new technology. Maybe quantum computing will finally leave the "coming soon!" stage it's been stuck in for over a decade sometime in the next five years; maybe dedicated-architecture magic will bridge the gap with existing manufacturing, the way it warped cryptocurrency; maybe a new AI methodology will bypass the technical challenges.

But because we physically can't brute-force it by throwing more transistors at the problem, it is uncertain. This is "fusion is 10 years away!" thinking, the same line we've heard for the last 50 years: the assumption that because visible progress is being made, the thing is obviously close to being sold to the end user. Maybe it's 5 years, maybe it's 10, maybe the technical challenges will keep cropping up for the next 50 years, as they have for fusion.
There is a lot of misinformation about quantum computing. It is not a replacement for traditional CPUs or even array processors (GPUs); it is only really useful for a selection of statistical computational tasks. Quantum computing simply has too much error inherent in the system to be used for things like banking or even running an operating system. At peak advancement it will go the same way as array computing: an accessory module that can be slotted into a computer and slaved to a traditional CPU.
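For what it's worth, that "slaved accessory" model is easy to picture: the classical CPU builds a sampling task, hands it to the quantum backend, and post-processes the noisy counts classically. A minimal sketch, assuming Qiskit, with its simulator standing in for real hardware:

```python
# Sketch of the "slaved accessory" pattern: the classical CPU builds a
# sampling task, hands it to the quantum backend, then works with the
# noisy statistics classically. AerSimulator stands in for real hardware.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle the pair
qc.measure_all()

backend = AerSimulator()
counts = backend.run(transpile(qc, backend), shots=2048).result().get_counts()

# CPU side: tolerate the inherent error by trusting the distribution,
# never any single shot.
print({bits: n / 2048 for bits, n in counts.items()})
```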

I've also seen the fallacy of brute-forcing with more transistors first-hand. For example, sargability is far more important than compute power; I've seen it provide multiple orders of magnitude of performance improvement. From the point of view of sargability, a neural network resembles several simultaneous recursive index-tree searches with a statistical best-fit match added to the output.
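A toy illustration of the sargability point, with made-up data and a sorted Python list standing in for a database index: a predicate the index can seek on versus one that forces a full scan.

```python
# Toy demonstration of sargability with made-up data: a sorted list plays
# the role of a database index. Same result, wildly different cost.
import bisect
import random
import time

ids = sorted(random.sample(range(100_000_000), 1_000_000))

def full_scan(target):
    # Non-sargable shape, like WHERE id + 0 = target: the expression on the
    # column defeats the index, so every row gets touched.
    return [x for x in ids if x + 0 == target]

def index_seek(target):
    # Sargable shape, like WHERE id = target: a binary search on the index.
    i = bisect.bisect_left(ids, target)
    return ids[i:i + 1] if i < len(ids) and ids[i] == target else []

target = ids[123_456]
t0 = time.perf_counter(); full_scan(target);  t1 = time.perf_counter()
index_seek(target);                           t2 = time.perf_counter()
print(f"scan: {t1 - t0:.4f}s  seek: {(t2 - t1) * 1e6:.1f}µs")
```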
 
