Why am I getting 100 KB/s download speeds from this site again?
Anybody? Is it just me? Some days the speed of this site is just abysmally low.
I get 60 KB/s XD. Well, I don't know what my normal download speed from yande.re is.
1 MiB/s to 2 MiB/s on a 1 Gbit connection.
A bit slow, but no problem with browsing and downloading all the images.
Same for me: my connection is around 35 Mbps down, and tonight Yande.re is really slow to load images after I have edited tags (the first load is as fast as expected, though). I thought it had been fixed since the last time.
The storage server uses ZFS, so there's a monthly scrub running. When that happens, it will be kinda slow for everyone for a few days. Scrubbing is a monthly process that the storage runs to make sure all file checksums are OK (and, if not, automatically heals them).
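
For the curious, this is roughly what it looks like on the admin side; "tank" is just a placeholder pool name, not our actual layout:

# "tank" is only an example pool name
zpool scrub tank     # starts the scrub: reads back every block and verifies its checksum
zpool status tank    # shows scrub progress and any errors that were repaired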
Checkmate said:
The storage server uses ZFS, so there's a monthly scrub running. When that happens, it will be kinda slow for everyone for a few days. Scrubbing is a monthly process that the storage runs to make sure all file checksums are OK (and, if not, automatically heals them).
OK. Now we'll know it's normal the next time it happens.

Thanks for the explanation!
Checkmate said:
The storage server uses ZFS, so there's a monthly scrub running. When that happens, it will be kinda slow for everyone for a few days. Scrubbing is a monthly process that the storage runs to make sure all file checksums are OK (and, if not, automatically heals them).
Oh, no wonder. I was worried yande.re had been DDoSed.

Thanks for explaining it to us. ^.^b
Checkmate said:
The storage server uses ZFS, so there's a monthly scrub running. When that happens, it will be kinda slow for everyone for a few days. Scrubbing is a monthly process that the storage runs to make sure all file checksums are OK (and, if not, automatically heals them).
Thanks for your reply. I've encountered another problem: when I download the new images at the front of the website, I get a fast download speed, but the download speed for old images beyond page 5000 is only 1 Mbps. Is it because of caching?
I want to add some extra caveats:

I did what I could to reduce the impact, so there's a small cache disk in front that is separate from the scrubbing. The most popular posts and images will be cached there for fast access. However, when the cache fills up, the least-viewed items are evicted from the cache drive.

That's why some images are fast and some are slow. After you load a slow one, it becomes fast for everyone else. This only applies to image assets, be it thumbnails, images, or full sizes. It does not apply to Pool Zips.
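
You can roughly see this from the outside by timing the same request a few times; the URL below is just a placeholder, so substitute any full-size image link from a post:

# IMAGE_URL is a placeholder; substitute any full-size image link from a post
IMAGE_URL='https://example.invalid/some-image.jpg'
# the first request comes off the slow storage; once the image is admitted to the
# cache, later requests should come back noticeably faster
for i in 1 2 3 4; do
  curl -s -o /dev/null -w "request $i: %{time_total}s\n" "$IMAGE_URL"
done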

I plan to increase the cache drive a bit eventually.

This will answer most of your questions.
Could you also automatically cache all recent images (posted up to 7 days ago)? That would make the most sense, since I suppose that's where most of the traffic goes.
Right now, I only have around 300GB of SSD to cache images, and it gets filled up in a day. Given the usage, I'd need 1TB of SSD to cache 3-5 days' worth of "recent images", and 2TB for 7 days. I'm not planning on 2TB, but 1TB is doable.

The system will cache an image if it is requested 3 times within the span of one minute.
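
Just to illustrate the idea, here's a toy sketch of that counter in shell; it is not the actual code, and copy_to_cache_disk is a made-up placeholder:

# toy sketch of the admission rule above, NOT the real implementation:
# admit a URL to the cache once it has been requested 3 times within 60 seconds
declare -A first_seen hits

should_cache() {
  local url=$1 now
  now=$(date +%s)
  # reset the window if this URL hasn't been seen, or its first hit was over 60s ago
  if [[ -z ${first_seen[$url]:-} ]] || (( now - first_seen[$url] > 60 )); then
    first_seen[$url]=$now
    hits[$url]=0
  fi
  hits[$url]=$(( hits[$url] + 1 ))
  (( hits[$url] >= 3 ))   # exit status 0 means "admit this image to the cache"
}
# usage sketch (copy_to_cache_disk is a placeholder): should_cache "$url" && copy_to_cache_disk "$url"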
I download about 1/3 of all new images on the site every day, and the total is usually around 300-400 to 600 MB of new images daily. I don't get how that alone fills a 300 GB SSD, but okay o_O
I'm sure we have tons of people like you, because I'm blowing past 10TB/day of bandwidth for yande.re alone.

0.3TB of cache against 10TB/day of traffic is a small amount; it's not enough to cover even a tenth of that, let alone all of it.
Checkmate said:
I'm sure we have tons of people like you, because I'm blowing past 10TB/day of bandwidth for yande.re alone.

0.3TB of cache against 10TB/day of traffic is a small amount; it's not enough to cover even a tenth of that, let alone all of it.
I guess that at least half of the server's 10TB daily bandwidth comes from crawlers. With the original caching method, the large number of requests from crawlers will fill the cache with entries that aren't useful. Current crawlers fetch images in two ways: one crawls images across the entire site from beginning to end, and the other fetches images based on tags. I recommend using the 300GB SSD to store the newest images, and if you can attach an ordinary 1TB disk to the server, it could serve as a cache for the older images (everything not on the 300GB SSD).
I plan to make around 1TB of a fast HDD array available to cache both jpg and png assets, whereas the 300GB NVMe will cache the thumbnails.
The new 1TB cache is now up and running.
Checkmate said:
The new 1TB cache is now up and running.
All hail our overlord. ^.^

Thanks for the hard work.
In just under two days, the 1TB array has filled to 750GB. I guess it will be full on the third day, as I calculated.
...and I'm back to near-zero download speeds again, rip.

It just took me four minutes to download a single 4.81 MB image o.o; the average speed is somewhere around 5-10 KB/s.

Edit (about 1 hour later): suddenly it started working again at normal speeds.
Marona762 said:
...and I'm back to near-zero download speeds again, rip.

It just took me four minutes to download a single 4.81 MB image o.o; the average speed is somewhere around 5-10 KB/s.

Edit (about 1 hour later): suddenly it started working again at normal speeds.
Still not enough, especially with the attacks on yande.re that are currently happening.

Maybe, apart from the cache, the bandwidth to the servers is also part of the problem?
Possibly congestion, given that most of the Western world is entering lockdown again (my lockdown has never been lifted since March; it's still ongoing). I'm not expecting the internet to be smooth.

Sorry, but you've got to live with how it is for now.

Edit: it could be something else; I'm looking into it.
Could it be some abusive bots, like the one messing with the image tags/sources today, congesting the bandwidth?
Marona762 said:
Could it be some abusive bots, like the one messing with the image tags/sources today, congesting the bandwidth?
It's possible.

But even without attacks, when there are large amounts of uploads within the same time frame (e.g. during those "Pixiv rush hours"), connection speed also drops significantly.
There's an abnormal IO problem on the array that I'm currently troubleshooting. The site will continue to run in a degraded state.

I may take it down occasionally, for a few moments at a time, to troubleshoot.
The array is now effectively slower than my SD card, so I've suspended it temporarily.

It is still slower than my SD card even when it sits idle.
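(The command below writes 1 GiB sequentially in 64 KiB blocks and syncs it to disk before dd reports the speed.)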

# dd if=/dev/zero of=test bs=64k count=16k conv=fdatasync; rm test
16384+0 records in
16384+0 records out
1073741824 bytes (1.1 GB, 1.0 GiB) copied, 239.832 s, 4.5 MB/s
Checkmate said:
The array is now effectively slower than my SD card, so I've suspended it temporarily.

It is still slower than my SD card even when it sits idle.
Now the site loads super fast (for now; not sure how well it will perform during the Pixiv rush hours, though).
I removed the array from production usage, pending further troubleshooting.
Checkmate said:
I removed the array from production usage, pending further troubleshooting.
Will the yande.re API be disabled? I accessed the website's API and got a 404 error. If so, some third-party clients may stop working. That said, the site does seem to be loading faster now.
Can you clarify which API URL returned the 404 error?
Checkmate said:
Can you clarify which API URL returned the 404 error?
https://yande.re/post.xml?limit=1
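
For reference, a quick way to check it from the command line (this just prints the HTTP status code the server returns):

curl -s -o /dev/null -w '%{http_code}\n' 'https://yande.re/post.xml?limit=1'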
Really slow again. Am I the only one? I know that my ISP (SFR, France) has had some problems in the evening (CET timezone) since Monday, but no other site is affected this much. Is it that monthly scrubbing operation again, Checkmate?