April 2025

demilypyro:

demilypyro:

demilypyro:

gonna start saying insane shit like fpreg and girlpussy just to normalize it

love all my cgirl and cboy followers and my normal followers too

do you really need to be cis? you could just be a more masculine trans girl

starlitsheep:

they’re gonna get pushed out of the car LMAO

kittybroker:

poorly-identifying-cats-in-posts:

little-goblin-girl:

plastidecor:

plastidecor:

Kitty masterpost

PIE AND HER BABIES!

Do not be fooled! These are not delicious foods!!! They are cats ✓!!! Do not try to eat!

Feast your eyes on these great prices! A collection of tasty treats delectable and safe for consumption! Pick up one of these for only $65.95 a set!

memingursa:

“Biden/Kamala would have been better”

phantomrose96:

anticorn-lawleag-ue:

aiweirdness:

“Slopsquatting” in a nutshell:


1. LLM-generated code tries to run code from online software packages. Which is normal, that’s how you get math packages and stuff but

2. The packages don’t exist. Which would normally cause an error but

3. Nefarious people have made malware under the package names that LLMs make up most often. So

4. Now the LLM code points to malware.

https://www.theregister.com/2025/04/12/ai_code_suggestions_sabotage_supply_chain/
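The four steps above boil down to: a hallucinated dependency name is indistinguishable, at install time, from a real one. One defensive habit is to vet any name an LLM hands you before installing it. Here's an illustrative Python sketch (not from the article; the allowlist, the `vet_package` helper, and the 0.8 cutoff are all hypothetical choices for the example) that flags names which are near-misses of packages you actually trust — the "a stov" problem — and refuses to silently accept unknown ones:

```python
import difflib

# Hypothetical allowlist: the packages your project has actually vetted.
TRUSTED_PACKAGES = {"numpy", "requests", "pandas", "scipy"}

def vet_package(name, trusted=TRUSTED_PACKAGES, cutoff=0.8):
    """Classify a proposed dependency name before installing it.

    Returns "ok" if the name is on the allowlist, "suspicious: ..." if it
    is a close misspelling of a trusted name (possible typo/slopsquat),
    and "unknown: ..." if it matches nothing we trust.
    """
    if name in trusted:
        return "ok"
    # get_close_matches keeps candidates whose similarity ratio >= cutoff.
    close = difflib.get_close_matches(name, trusted, n=1, cutoff=cutoff)
    if close:
        return f"suspicious: looks like a near-miss of {close[0]!r}"
    return "unknown: not on the allowlist, review before installing"

print(vet_package("numpy"))   # ok
print(vet_package("numpi"))   # flagged as a near-miss of numpy
print(vet_package("hotcob"))  # unknown: a human should look at this
```

This only catches the lookalike case; a hallucinated name that resembles nothing you use still lands in the "unknown" bucket, which is the point — nothing gets installed just because it resolved.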

I’m sitting around waiting for the rain to pass so I can go home from my job as a programmer who uses open source software.

And since I’m waiting I decided to contribute an over-simplified analogy to explain this.

Analogy: You’re in charge of running a kitchen. You and your staff create recipes and sell the meals you make. Inevitably though your recipes will call for things like “a stove” and “a blender” which you and your staff would not want to create from scratch.

Luckily “a stove” and “a blender” are things you can acquire and not try to make from scratch. You and your staff, as humans, are capable of recognizing real appliances, and getting them from real sources.

(There is actually an existing threat where “a stov” is a malicious thing, created by someone who knows “a stove” is in hot demand and is trying to take advantage of someone who might typo when ordering “a stove”. There are some safeguards in this space, but they’re not 100% effective.)

But now there’s Cooking AI that can run your kitchen for you 🙂. It can write your recipes, order the necessities, and assemble the dish for you 🙃. Your boss fires you and your staff and just uses the Cooking AI.

The AI, in its infinite wisdom, starts writing recipes that call to be cooked on “a hotcob”. It writes recipes that call for the ingredients to be assembled in “a produceslicer”. These are not real things. And usually when the AI tries this, the process will error out because it fails at the process of acquiring the hotcob or the produceslicer.

But the kinds of people already profiting off supplying “a stov” take notice. AI likes to request these appliances frequently. The retailer offering “a stov” starts offering “a hotcob” and “a produceslicer”. Now these AI-automated chefs succeed because their recipe order comes together!

“A hotcob” adds liquid mercury to all the dishes. “A produceslicer” hacks your wifi and steals all of the business’s information. This is allowed because the AI chef welcomed these things in, signed for them, and hooked them up.

Really good. All of this. Great job, AI.

Windows 7

armor-piercing-cumshot:

demilypyro:

biggreenplanet:

amyashen:

cabybapa:

coreytasticc:

super-metroid:

metalbatteryzone:

kittycatblast:

charredasperity:

pissvortex:

constructreese:

cowwatcher:

nyxwoven:

thetabbybadger:

thatchickatthecomicbookstore:

sometimesthebestteacupischipped:

meashre:

nadhie:

itsrapsodia:

chillyfeetsteak:

peterpanswendy:

captain-price-unofficially:

albierio:

berix2010:

mere-technicality:

r0zeclawz:

cottoncandylesbo:

cognitohazardous:

bluehawk54:

k8-says——-hi:

cyborgbuddy:

creepycrag:

beatrixisakitty:

desktop-assistant:

problemnyatic:

cheetahgirlmuscles:

loop-zoop:

communist-ojou-sama:

molsno:

autolenaphilia:

onvochn:

xlugger:

gentaroukisaragi:

thegreathomestuckreread:

switch:

possessedscholar:

coldgoldlazarus:

balmungkriemhild:

wavepriisms:

jayemadeablog:

f3mboyfucker:

metalgearsolidyaoi:

honeysider:

pc-98s:

gen4:

Windows 7 is the next release of Microsoft Windows, an operating system produced by Microsoft.

Windows 7 is intended to be an incremental upgrade with the goal of being fully compatible with existing device drivers, applications, and hardware

can’t wait for this

herpsandbirds:

Long-tailed Broadbill (Psarisomus dalhousiae), family Eurylaimidae, order Passeriformes, Hong Kong

photograph by CL Lee

sapphoandvanzetti:

transgang-deactivated20201105:

you know even if a homeless person or a starving person is in that position because of their own “bad decisions” i don’t care. it doesn’t matter. no supposed financial misstep is enough to condemn someone to homelessness or poverty.

Search and rescue teams do not ask if a hiker was properly equipped and prepared before they go out to look for them.

EMTs do not ask if a driver checked their mirrors before they take them to the hospital.

Lifeguards do not ask if a swimmer made a mistake by going into a riptide before they dive in after them.

Judgement doesn’t help anyone, including you, the person doing the judging. Just help people. Just shut the fuck up and help people.

alleesaur:

giraffe demon

whatcoloristhatcat:

blignick:

black tortoiseshell with moderate white spotting (calico), black with high white spotting

catgirl-catboy:

I think polyamory should be used to make romance subplots even more convoluted, actually. Sure, it could theoretically solve a love triangle- but it could also make things infinitely more chaotic and that’s way better.

vomitpartyart:

comic about slop

times-chu:

sunshinecassette:

sillayangel:

fricktic:

made some versions of the agony grip for my friends for when the whole gang gets it . including different levels depending on the anguish

and a joyous one for when there is love abound

can i make a contribution?

for when the whole gang is being real autistic about something

For when you say something absolutely horrid in the group chat

Three blind mice.

xilloz:

Hi Tumblr

iggykoopa666:

“some of you are miserable because you’re mean” post except its “some of you are miserable because you only look at and talk about stuff you dislike”

nick-nonya:

secondimpact:

cannibalchicken:

teobug:

escuerzoresucitado:

You get there and all the pretty boys look up from drinking from the reservoir and gallop away like gazelle

babyanimalgifs:

Lion kneading/making biscuits

(Source)

mimihime:

radiation:

radiation:

What shall i draw with my tools

Pharrell

It Might Seem Crazy What Im Bout To Say..

spockandawe:

spockandawe:

Eight by fourteen inches equals roughly 22,000 individual stitches and just as many reasons to regret my poor life choices, but it’s DONE! I don’t want to think about how many hours of work went into this, but I’m so pleased with how it came out. I’ve had this in the works for about a year now, working on it here and there, but the last few weeks I’ve really pushed to finish it up, and I’m super happy with how this came out.

Aww, this got reblogged recently and has been getting notes, and oh man, what a nostalgic blast from the past!! This ended up being the only complete homestuck cross-stitch I ever did (I haven’t given up on sbahj, but i think i need to redo the pattern, which. sobbing.) but this was the start of a great cross-stitch era in my life! I’ve recently picked it back up, and am literally like, no more than a day or two from showing off a new complete project, so this is incredibly timely, haha.

I really recommend this as a hobby to anyone who needs something creative to keep their hands busy! I don’t think the timing of my first cross-stitch era and my second are coincidental, it’s meditative and sooothing, and allows me to continue MAKING THINGS without being demanding in terms of creative initiative. I don’t know what i want to draw or write, but yes, I can continue stitching pixel after pixel of color. And I’m MUCH faster now than I was then, a year of work is definitely not necessary, that’s a project that fits into a month or less 😂

beetledrink:

i hate it when people are writing a long ass thing and start a parenthetical aside and forget to close parentheses it makes me feel like i cant escape from the sentence

charl0ttan:

its soooooooooooooooooooooooooooo wild how MAD people get about art. what counts as art. what counts as Good art. what art shouldnt exist. what art is useless. so whattttt if its too avant garde or too simple or you personally dont agree with it. its art and we have to keep doing as much of it as we can get away with every day & if you want to stop it or get mad about it youre missing something huge as fuckkkkkkkkkkkk about how to live

byjove:

I love movies where the plot takes place in less than a day. It’s like. What if these people were experiencing the worst 8 hours of their entire lives and you got to see the highlight reel?

ask-ciaphas-cain:

Why WOULD I pay for tumblr premium when I get quality ads like this???

m-has-a-blog:

Lovely sentiment but the way it’s worded sounds like this dude got fucking killed during a little league game

fraktalbrokkoli:

Okay this comment I just came across deserves its own post. Imagine showing this to someone who’s never used Tumblr and explaining why haikus are relevant in this context.

Screenshot of a tumblr comment by user incubus-absolution: hm. note to self: make sure from now on that my hornyposts are not accidental haikus.

red-velvet-0w0:

L + ratio + factories churn + bodies burn + stars are shining bright + its your turn + now you learn + how king cole feasts tonight

sharkdogsomething-deactivated20:

teaboot:

I don’t know why this keeps happening but I keep meeting toxic heterosexual couples who experiment with polyamory and are heavily into funko pops, board games, Disney princesses and Burlesque stripping and the man is always a withdrawn bearded dude and the woman is always a passive aggressive control freak with an Etsy shop that sells lawn gnomes styled after Dr Who characters and they don’t really even seem to like each other but they’re always exactly the same. this has happened four times

@niceferatu

junewild:

junewild:

junewild:

socks are the primary producers of the laundry biome. they typically mate for life and come in a wide variety of patterns, though—unlike shoes, which many theorize to be a symbiotic species—they lack sexual dimorphism. juvenile socks resemble their parents, but have yet to develop the long necks that distinguish socks from other species of the extremity family, such as mittens

the lint trap is a fascinating example of a decomposer. it relies on the environment to bring food in the form of detritus, which it then breaks down into lint. lint traps have relatively long lives in comparison with other species (especially given the recent downward trends in lifespan, which are likely caused by a combination of genetic bottlenecks and poor nutrition). the lint trap has an unusual relationship with fire—some theorize that it uses fire as a tool to increase resource availability, while others believe that its frequent proximity to fire is due to environmental factors

the apex predator of the laundry biome is, of course, the dreaded duvet cover. duvet covers lead solitary lives, and are rarely seen socializing with one another. its preferred prey is socks, although it is an opportunistic eater and will prey upon much larger targets, such as t-shirts, leggings, and even sheets. aside from its large territory and antisocial nature, its behaviors are poorly known and highly controversial. one major theory is that the duvet cover is an ambush predator, lying in wait for its prey. another is that the duvet cover seeks out prey, using its superior size and large mouth to overwhelm its victims in a matter of seconds. a third, less popular supposition is that the duvet cover lures its victims to it by mimicking the laundry bag, a preferred shelter for many residents of the laundry biome. more research on this topic is necessary

louisegluckpdf:

i remember learning the word melancholy at age 7 or something and thinking oh this word’s gonna be huge for me

2003-playground:

Can we stop using “still lives with their parents” or “unemployed” or “doesn’t have a drivers license” or “didn’t graduate high school” as an insult or evidence that someone is a bad person? Struggling with independence or meeting milestones is not a moral failing.

alllgator-blood:

I WAS SUPPOSED TO POST THIS YESTERDAY AND GOT SO SIDETRACKED. For that asker that requested a Spare Crumb of Goat Content…..this is the first finished art I made of the goat, from months ago. I actually drew the goat the same night unholy alliance was announced, but that was during a hiatus so I never ended up posting it.

ANYWAY THIS IS THE PERFECT TIME TO TELL A LESHY KNUCKLEBONES STORY THAT HAPPENED AFTER I DREW THIS. I had him make an egg despite my firm belief he’s an aroace king, I just wanted one baby from each bishop, and this motherfucker happened to be playing knucklebones when I summoned him to the mating tent. SO INSTEAD OF DOING THE FLIRTY ANIMATION HE WAS JUST THROWING FUCKING DICE AT AMDUSIAS?? AND IT WORKED????

deafmic:

immaplatypus:

deafmic:

deafmic:

today i found out that clip studio paint is customizable enough that i can turn the UI into paint tool sai and that’s great for me, the person who got so attached to paint tool sai version 1 that i have not been willing to learn another art program in 10 years (which is extremely inconvenient)

we have paint tool sai at home

paint tool sai at home:

FOR THE LOVE OF GOD PLEASE EXPLAIN HOW TO DO THIS. I’M LITERALLY IN THE SAME BOAT AS YOU, I HAVE BOTH PROGRAMS BUT I’M SO IN LOVE WITH SAI THAT I DON’T WANT TO LEAVE IT. clip studio paint seems DIVINE but i’m so scared of change

it’s super late at night so sorry in advance if i don’t explain this well but basically, almost everything in CSP’s UI can be changed and customized and you can save those configurations too

this is the default interface:


every single one of those panels pops out and can be stacked, positioned, and resized as much as you want.

here’s a rough markup of everything that can be changed (wrt existing components, you can add more)


the components mostly operate in a click and drag way. you grab the little tab at the top and stick it wherever you want. you can stack them by dragging them above the other component and you can also group them in a tab by dragging one into the other. resizing is similar; you just drag the edge of the component (idk what these things are actually called so I’m using terminology from my job lol) to wherever you want and it’ll scale itself. it’s actually surprisingly easy to work with imo

for some of the other components like the tool bars, you’ll have to go to the window menu at the top of the program. this gives you the options to mess with those tool/command bars but it also lets you add and delete (hide, technically bc it keeps the settings) any component you want and there’s components for literally EVERYTHING in csp. it’s just a switch you flip in the menu that allows you to add/remove it.


it’s really really easy to get it set up to your work style!! i have a set of tools and hotkeys that i was able to replicate by doing this and it wasn’t very hard to set it up how i wanted

btw i don’t know if you paint but the main reason i waited so long is because sai’s painting engine is so good that no other program can come close, but csp has some really interesting brush and color blending options that you can customize for each brush and I’ve been able to get pretty damn close to the sai experience. it can be hard to get used to though and I’m still trying to adapt. I’d recommend messing around with the brushes with their settings box open, because csp lets you test out changes as you’re making them, so while testing is a pain in the ass, that does make it easier lol

sacred-portal:

chopnuts:

so-much-for-subtlety:

mathew:

SHE THINKS HER LAUGH IS A SONG SO SHE SINGS BACK