

Mostly connections and luck based on the trash I hear that’s somehow popular.
/u/outwrangle before everything went to shit in 2020, /u/emma_lazarus for a while after that, now I’m all queermunist!
Or I could just use the dash - way easier.
And it doesn’t make me look like a robot.
In the comic, the desire path happened because they wouldn’t just connect to the crosswalk. Even in the end, the constructed path connects to the sidewalk slightly to the side instead of going straight to the crosswalk.
Desire paths happen because of poor, antihuman design choices.
You say this is human nature, but I see this comments section filled with good little doggies that follow master’s rules.
What the hell am I looking at? What the fuck is going on with those diagonal arrows??
My keyboard does not have an em dash and I have never seen one that does.
Still sus. 🤔
Humans just use dashes - they get the point across and don’t require esoteric button presses.
People are organizing in their communities to patrol for ICE and respond to ICE raids, which is quite a bit like the Black Panthers’ “cop watch” and community patrols.
I know it’s not like when protesters burned down a police station in 2020, but things are happening.
Why do people even like daylight? It just gives you wrinkles and cancer, that shit sucks.
Right, that’s the point of the “taxpayer” dog whistle. It trains people to think that poor people don’t pay taxes, and implies they’re lesser members of society because of it.
They certainly pay sales taxes, and possibly property taxes too. They probably have to pay fines, because the cops love to prey on the poor, and there are also fees to use government services. That’s all taxes.
But there’s this concerted effort to denigrate people with lower incomes as useless eaters who don’t contribute to society, and so they don’t think any of that counts.
They don’t believe anyone under a certain income pays taxes. “Taxpayer” just means “upper income tax bracket” in their mind.
My definition of artificial is a system that was consciously engineered by humans.
And humans consciously decided what data to include, consciously created most of the data themselves, and consciously annotated the data for training. Conscious decisions are all over the dataset, even if they didn’t design the neural network directly from the ground up. The system still evolved from conscious inputs; you can’t erase its roots and call it natural.
Human-like object concept representations emerge from datasets made by humans because humans made them.
Bath water flavor.
I’m saying that the terms “natural” and “artificial” are in a dialectical relationship: they define each other through their contradictions. Those words don’t mean anything once you include everything humans do as natural; you’ve effectively defined “artificial” out of existence, and as a result “natural” too.
If we define human inputs as “natural” then the word basically ceases to mean anything.
It’s the equivalent of saying that paintings and sculptures emerge naturally because artists are human and humans are natural.
LLMs create a useful representation of the world that is similar to our own when we feed them our human-created, human-curated, human-annotated data. This doesn’t tell us much about the nature of large language models or the nature of object concept representations; what it tells us is that human inputs result in human-like outputs.
Claims about “nature” are much broader than the findings warrant. We’d need to see LLMs fed entirely non-human datasets (no human creation, no human curation, no human annotation) before we could make claims about what emerges naturally.
I’m not disputing this, but I also don’t see why that’s important.
What’s important is the use of “natural” here, because it implies something fundamental about language and material reality, rather than this just being a reflection of the human data fed into the model. You did it yourself when you said:
If you evolved a neural network on raw data from the environment, it would eventually start creating similar types of representations as well because it’s an efficient way to model the world.
And we just don’t know this. This paper doesn’t demonstrate it because (as I’ve said) we aren’t feeding the LLMs raw data from the environment; we’re feeding them inputs from humans, and then they’re displaying human-like outputs.
Did you actually read through the paper?
From the paper:
to what extent can complex, task-general psychological representations emerge without explicit task-specific training, and how do these compare to human cognitive processes across a broad range of tasks and domains?
But their training still rests on a dataset picked by humans, given textual descriptions written by humans, and then probed with a representation learning method previously designed for human participants. That’s not “natural”, that’s human.
A more accurate conclusion would be: human-like object concept representations emerge when fed data collected by humans, curated by humans, annotated by humans, and then tested by representation learning methods designed for humans.
human in ➡️ human out
I didn’t say they’re encoding raw data from nature.
Ultimately the data both human brains and artificial neural networks are trained on comes from the material reality we inhabit.
Anyway, the data they’re getting doesn’t just come in a human format. The data we record is only recorded because we find it meaningful as humans, and most of the data is generated entirely by humans besides. You can’t separate these things; they’re human-like because they’re human-based.
It’s not merely natural. It’s human.
If you evolved a neural network on raw data from the environment, it would eventually start creating similar types of representations as well because it’s an efficient way to model the world.
We don’t know that.
We know that LLMs, when fed human-like inputs, produce human-like outputs. That’s it. That tells us more about LLMs and humans than it tells us about nature itself.
Just shovel them into my mouth. 👀