Russian 'P poster
You can help by notifying the authorities.
On imageboards, an obsessed Central Asian(?) pedophile occasionally advertises a shortened link to 'P files, accompanied by a 'P image and a description of extremely degenerate antimatter. On active imageboards, the post is banned on sight. Investigation by a swinnycuck showed that the bot targets imageboards listed on an imageboard directory website and that the person running it has been doing so for 4 years.[1][2] I'll find the post later, but a non-soyosphere altchan owner had a thread on his site about a bot that has been using Russian IPs to advertise CSAM on various altchans since around 2020; presumably these are run by the same person.
This spammer reportedly also tried to spam Kiwi Farms and sold 'P. If you have any information on him, report it to the authorities.
Here's a quick-and-dirty system I just made for the chaddy to prevent this bot; feel free to use it. The "bot's" fingerprint looks 100% like a real browser and it solves captchas, so it might not even be a bot, just an obsessed individual posting by hand. It also seems to target every altchan, even ones not listed on imageboards.net.
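The code below assumes node-fetch v2 (whose fetch accepts the timeout and agent options used here) and socks-proxy-agent for the optional proxy; something like this at the top of the file:

const fetch = require("node-fetch"); // v2: supports { timeout, agent }
const { SocksProxyAgent } = require("socks-proxy-agent");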
checkPostShorteners: async (options) => {
	const { proxy } = config.get;
	// TODO: move these defaults into the config proper
	const blacklistedKeywords = config.get.blacklistedKeywords || [
		"pomf2.lain.la",
		"small previews:",
	];
	const post = await Posts.getPost(
		options.board._id || options.board,
		options.postId,
		true
	);
	// Pull every http(s)/www link out of the unformatted post body
	const siteLinks = post.nomarkup.match(
		/\b(?:https?:\/\/|www\.)[^\s<>"']+/gi
	);
	if (!siteLinks || siteLinks.length === 0) {
		return false;
	}
	// maybe only check duplicate domains?
	const uniqueLinks = [...new Set(siteLinks)].slice(0, 10);
	console.log("Found links:", uniqueLinks);
	// Route lookups through the configured SOCKS proxy, if enabled.
	// socks-proxy-agent takes the URI string directly; no url.parse needed.
	const agent = proxy.enabled
		? new SocksProxyAgent(proxy.address)
		: null;
	// Fetch each link and check the page text for blacklisted keywords.
	// Promise.all + .some() instead of Promise.any: Promise.any resolves
	// with the FIRST fulfilled value, which here could be a false from a
	// fast clean link while a slower bad one was still loading.
	const results = await Promise.all(
		uniqueLinks.map(async (url) => {
			try {
				const response = await fetch(url, {
					timeout: 10000,
					agent,
					redirect: "follow",
					headers: {
						"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:146.0) Gecko/20100101 Firefox/146.0",
					},
				});
				const text = await response.text();
				if (blacklistedKeywords.some((keyword) => text.includes(keyword))) {
					console.log(`${url} matched blacklisted keywords`);
					return true;
				}
			} catch (e) {
				console.warn(`Error fetching ${url}:`, e.message);
			}
			return false;
		})
	);
	return results.some(Boolean);
},
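And a minimal sketch of calling it after a post is inserted; the surrounding hook and the deleteAndBan helper are hypothetical stand-ins, not actual chaddy internals, so adapt it to whatever your engine runs post-creation:

// hypothetical post-creation hook; adjust names to your engine
const badSiteFound = await checkPostShorteners({
	board: res.locals.board,
	postId: newPost.postId,
});
if (badSiteFound) {
	await deleteAndBan(newPost); // e.g. remove the post and ban the IP
}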