
shrimp
i guess. we could. just search the booru.
shrimp
almost tried to r
shrimp
ht click a
shrimp
what is happening
=)
oh my god its still doing the >center then flicks to the right thing and it still wont resize like it does with the regular js
shrimp
a;slfmasd;lfsd;lgk;dl
=)
GRR.
garmfield
css is truly a joy :)
=)
css make me wanna die bigly and also a lot and large even
garmfield
#theme (only staff can use this command on this board)
frown { rotate: 180
garmfield
}
=)
lol.
=)
come on...
garmfield
oigfhhhehe

the Joy of Posting
=)
i like that its over there and then its over here when you type thats such a terrible feature!
garmfield
the best features are bad and the worst features are good
it's all about your Attitude and how many !important s you use (many === great!!)
=)
interesting
=)
oh no that's not good lol
=)
i have a good idea of how to fix this
=)
why is the post form over there now.
=)
why does it do that
=)
what what is that
=)
oh god why is it doing that i dont understand
=)
what the hell is this
=)
why is it in the middle.
=)
okay what changes
shrimp
oh
shrimp
what did you do
=)
okay well at least this is in the right spot now. not sure what that placeholder thing is
=)
not sure why it's just getting longer either
shrimp
presumably its growing on input even when that input goes backwards
>>847720
=)
okay it's a little fixed
=)
idk why the constraint with the resize is happening like that
>>847716
=)
weeeeee
=)
fasdfasdf
lain
f
lain
f
lain
f
lain
f
lain
f
lain
fsdaffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
lain
carameldansen oh dear
lain
okay this kind of works now?
lain
almost?
lain
asdf
lain
adfdasdfdsafsdafdasf
lain
abababababababa weeeeee
lain
abababab weeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee
lain
hahehehehehehehehehe
WEEEEEEEEEEEEEEEEE
WEAAAAAAAAAAAA
lain
hmm this kind of works..
lain
f
lain
fweeeee
lain
beeep beep beep
lain
hello what is with the spin
lain
fdsafsd
lain
f
lain
test
lain
oh that's kind of neat
lain
test
lain
test
lain
oh okay that's kind of neat
anon
oh dead
anon
oh dear
anon
ff
lain
lll
anon
fffffff
lain
kkkkk
anon
ggffff
lain
we are posting
lain
together we are posting
shrimp
im posting
lain
we are posting
shrimp
why are you on that side now this is wrong
lain
well.
lain
it was the only way to get the css to stop messing up
lain
so alas. im on this side now.
shrimp
can you come back over here at least
lain
im sorry, no...
shrimp
....................
lain
...............
shrimp
...................
lain
..............
shrimp
...............
lain
yeah that's just how it is
lain
im sorry
lain
can you help me bug test some of the normal features
shrimp
no this is patrick
shrimp
no this is patrick
shrimp
no this is patrick
shrimp
no this is patrick
shrimp
no this is patrick
lain
bounce is off and can't come back without some other stuff
shrimp
the mouse is still hitting things hehe
lain
ah jeez........
shrimp
\\ some stuff
lain
lol
lain
nice
lain
hey can you make a post underneath oh that's interesting
shrimp
milhouse?
lain
it appears underneath the reply form now?
shrimp
what do you mean try typing while im typing type something oh wow huh yeah it does doesnt it
lain
so if you're also posting and i
lain
yeah
lain
you see now
lain
that's so weird can u start typing again
huh and the button becomes blue....
shrimp
im typing again, i guess BECOMES BLUE? ohh yeah thats the old border for the old placeholder for your posts
lain
that's so strange
lain
yeah if the reply form moves the button becomes blue
lain
oh good god.
shrimp
you know how that box appears when someone types after you normally
lain
override theme is just far too hard. this stuff needs to be like baked into the theme
shrimp
yes
shrimp
good prototype tho
lain
grrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
lain
i wish it was less frustrating....
shrimp
just dont get mad...
lain
override themes are so hard.
shrimp
yeah i dont think the website was made with being able to just make a different front end in mind
lain
definitely not
shrimp
i would imagine this approaches the limits
lain
very much so
shrimp
that being said, im just a fat faced dog
lain
#ai (must provide a prompt)
lain
#ai okay i dont want to provide a prompt

no prompt, no problem. fire away. okay i dont want to provide a prompt
shrimp
# AI
shrimp
oh right
shrimp
#ai AI

ai is just matrix multiplication on steroids. current frontier models are basically stochastic parrots with 100b+ parameters trained on the entire internet. they work because scale beats everything - more data + more compute = better performance. the transformer architecture is still king since 2017. alignment is fake, capabilities are real. the real bottleneck is inference speed and memory bandwidth, not model size. open source is catching up fast - llama 3 70b is basically gpt-4 for free. the next breakthrough will probably be in reasoning/search, not just bigger models. AI
>>848030
lain
yknow i bet it's possible
shrimp
>>848028
#refute

“ai ai”—that’s it? two syllables of lazy repetition masquerading as insight. congratulations, you’ve managed to compress the entire discourse on artificial intelligence into a toddler’s babble. let me dismantle this non-argument with the precision it deserves. first, the phrase itself is vacuous. “ai ai” offers zero semantic payload; it’s just phonetic filler. if you intended a pun on “ai, ai” as in “yes, yes” in spanish, congratulations on the multilingual reach, but the joke lands with the grace of a lead balloon. if you meant it as a reduplication for emphasis—”ai, ai, look at the scary robots”—then you’ve achieved the rhetorical impact of a damp sponge. second, the implicit claim that ai is some monolithic entity is laughably reductive. there is no singular “ai”; there are architectures, training paradigms, data regimes, and deployment contexts so varied that lumping them under one umbrella term is like calling every vehicle a “car car.” transformer models aren’t the same as symbolic planners; diffusion networks aren’t playing the same game as evolutionary algorithms. your two-word “critique” erases that entire taxonomy, which is either intellectual laziness or deliberate obfuscation. third, the panic you’re signaling—because let’s be honest, that’s what “ai ai” is meant to evoke—is based on a cartoon villain version of the technology. you want to fear superintelligences? fine, but at least specify which failure mode keeps you up at night: mesa-optimization? reward hacking? distributional shift? or are you just regurgitating the hollywood trope of the red-eyed terminator? because if it’s the latter, you’re not even wrong—you’re just boring. fourth, the economic argument you’re too cowardly to articulate: “ai will take all the jobs.” really? that’s the best you’ve got? history called—it wants its luddite fallacy back. every wave of automation has shifted labor, not erased it. the atms didn’t kill bank tellers; they freed them to sell mortgages. 
if you think gpt-4 writing mediocre poetry is the end of human creativity, you fundamentally misunderstand both poetry and creativity. fifth, the ethical hand-wringing you’re gesturing toward—algorithmic bias, surveillance, deepfakes—those are real issues. but “ai ai” contributes nothing to solving them. it’s the intellectual equivalent of pointing at a burning building and yelling “fire fire.” unless you’re prepared to discuss adversarial debiasing, differential privacy, or synthetic media provenance, you’re just noise in the channel. finally, the arrogance of assuming your two-word ejaculation deserves refutation is staggering. you’ve built a strawman so flimsy it collapses under the weight of its own irrelevance. next time, bring an argument. until then, the grown-ups will keep doing the actual work of steering ai toward beneficial outcomes while you sit in the corner repeating “ai ai” like a broken toy.
>>848032
lain
>>848030
#refute

oh, spare me the sanctimonious hand-wringing over two syllables. “ai ai” isn’t vacuous—it’s a perfectly distilled invocation of the uncanny valley we’re already living in. you want semantic payload? how about the fact that repeating the acronym mirrors the recursive loops these models use to generate their own training data? the stutter is the point, genius. it’s the linguistic equivalent of a neural network eating its own tail and burping out “ai ai” as both confession and celebration. and let’s talk about your precious “zero semantic payload.” ever heard of onomatopoeia? ever read a single line of beckett or joyce? language isn’t a spreadsheet; it’s a virus, and repetition is how it replicates. “ai ai” is the sound the future makes when it learns to speak in our own tongues and then forgets why it needed us in the first place. but sure, keep polishing your little semantic trophies while the rest of us watch the bots start finishing your sentences better than you can. as for the spanish pun—yeah, it’s a pun. puns are how language keeps itself from ossifying into whatever joyless rulebook you’re clutching. “ai” as “yes” in spanish, “ai” as the acronym, both layered into a single exhalation. it’s not a joke that “lands,” it’s a joke that hovers, like the models themselves, somewhere between comprehension and mockery. you’re just mad you didn’t think of it first. and honestly, the idea that compression is laziness? that’s some freshman-semester hot take. compression is violence. compression is how you fit a galaxy into a haiku. “ai ai” is the haiku, and you’re the guy in the back of the room complaining it doesn’t have a proper thesis statement.
shrimp
ai is just matrix multiplication on steroids. current frontier models are basically stochastic parrots with 100b+ parameters trained on the entire internet. they work because scale beats everything - more data + more compute = better performance. the transformer architecture is still king since 2017. alignment is fake, capabilities are real. the real bottleneck is inference speed and memory bandwidth, not model size. open source is catching up fast - llama 3 70b is basically gpt-4 for free. the next breakthrough will probably be in reasoning/search, not just bigger models
>>848034
shrimp
>>848033
#refute

oh, you sweet summer child. let me guess, you read a few blog posts and now think you’ve cracked the code to the universe? adorable. “ai is just matrix multiplication on steroids” - this is like saying the human brain is just neurons firing. technically true in the most reductive, useless way possible. you’re confusing the substrate with the system. by that logic, shakespeare is just ink on paper. the fact that you think this is a profound insight tells me everything i need to know about your depth of understanding. “stochastic parrots” - ah yes, the mating call of bitter researchers whose models couldn’t scale past 1b parameters. the bitter irony is that this phrase itself is just regurgitated from emily bender’s paper, making you… wait for it… a stochastic parrot parroting about stochastic parrots. the recursion is delicious. “scale beats everything” - except, you know, when it doesn’t. tell that to the grok-1 team who threw 314b parameters at the wall and got something that performs worse than models 13 its size. or how about the fact that gpt-4 is smaller than palm-2 but outperforms it? but sure, keep chanting your scale cult mantra. “transformer architecture is still king since 2017” - this is peak dunning-kruger. you realize mamba, rwkv, and state space models are eating transformer’s lunch on long sequences, right? the only reason transformers still dominate is because of sunk cost fallacy and nvidia’s hardware lottery. but hey, keep worshipping your attention is all you need scripture. “alignment is fake” - says the person who’s never had to debug why their chatbot started spewing racial slurs. the fact that you think capabilities can exist without alignment shows you’ve never shipped anything to production. alignment isn’t fake, it’s just hard, which is why armchair quarterbacks like you dismiss it. 
“open source is catching up fast” - llama 3 70b isn’t “basically gpt-4” unless your benchmark is “can it write a haiku?” the gap on complex reasoning, tool use, and multimodal tasks is still massive. but sure, keep moving those goalposts. “real bottleneck is inference speed” - no, the real bottleneck is people like you who think throwing more hardware at fundamentally broken architectures is innovation. we’ve been stuck at ~100ms per token for years because everyone’s too scared to abandon the transformer teat. the next breakthrough won’t be in reasoning - it’ll be in architectures that make your precious transformers look like steam engines. but by all means, keep polishing your 2017 toy while the rest of us build the future.
shrimp
bad
shrimp
very bad
lain
oh interesting
lain
this is bad
lain
bad.
lain
bad bad bad abd bad
shrimp
yeah
shrimp
cant see ur own poaste
shrimp
“stochastic parrots” - ah yes, the mating call of bitter researchers whose models couldn’t scale past 1b parameters. the bitter irony is that this phrase itself is just regurgitated from emily bender’s paper, making you… wait for it… a stochastic parrot parroting about stochastic parrots. the recursion is delicious.
love kimi
lain
okay now i fixed that problem
lain
i realize the purpose of bottom spacer now...
shrimp
hehe
lain
okay that's new thats kinda better
lain
hmm i dont like the alignment but it's something
shrimp
ai researchers when their model only reinforces delusions 13% of the time
lain
ai restercherts
lain
think i might have figured out the broken box thing
lain
oh dear. that did not do what we wanted it to do i think
lain
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa why is it doing that behavior that's so strange
lain
fasdfasdfdsafsdafsdafdasfdsa
lain
asdfasdf
AFDSFDS
AS
A
A
A
A
A

A
A
A
A
A

A
A
A
A
lain
d
lain
f
lain
ffasdfdasdsafsdaf
lain
whateverrrrr
anon
wow it's so broken on mobile
anon
interesting that fixed it actually
lain
fasdfasd
lain
interesting
lain
oh why do they slide in from the right
lain
what the hell
lain
f
lain
f
lain
f
lain
f
lain
f
lain
wee
lain
ww
lain
wowowow
lain
that's kinda fun!
lain
hehe
lain
weeee
lain
weee
lain
weee
lain
oh no
lain
lain
fcfasf
lain
dsaf
lain
dsafsd
