Tuesday, December 11


Some People Have All The Leak

That memory leak came back and took the site down again.

Why didn't the monitoring system send me an email...  Oh, it did?  Like, an hour ago?  Ugh.  Never mind.

I've increased the server's RAM to 16GB (way more than it really needs once I track down the leak) and added a daily restart of the application, which takes about three seconds.  Three years of those restarts add up to roughly the downtime we just had, so that's a reasonable tradeoff.  Actually, with 16GB of RAM it might not be possible for it to run out at all; even with the leak it should only ever reach 12GB.
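For the sake of the arithmetic, here's that tradeoff sketched out (the three seconds and the 12GB ceiling are from above; the rest is just multiplication):

```python
# Back-of-the-envelope numbers for the restart-vs-leak tradeoff.
restart_seconds = 3                          # downtime per daily restart
total_downtime = restart_seconds * 365 * 3   # three years of daily restarts
print(total_downtime)                        # 3285 seconds - a bit under an hour

leak_ceiling_gb = 12   # worst growth seen with the leak running unchecked
ram_gb = 16
print(ram_gb - leak_ceiling_gb)  # 4GB of headroom even on the worst day
```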

We should be moving over to the new server at Christmas, which won't have this problem.

Posted by: Pixy Misa at 11:11 PM | Comments (6) | Add Comment | Trackbacks (Suck)
Post contains 125 words, total size 1 kb.

Monday, October 22


The Great Undespamming

It looks like about 62,000 comments from earlier this year got caught up in a post-hoc spam filter and vanished.  I've unvanished them, but that leaves us with a secondary problem: some of the comments caught by the spam filter actually were spam.

And now they're back, and we'll have to de-spam them again.

Exclusive Video Footage of Comments Being Despammed and Undespammed Again

Posted by: Pixy Misa at 01:57 PM | Comments (2) | Add Comment | Trackbacks (Suck)
Post contains 70 words, total size 1 kb.

Sunday, September 16


Performance Glitches

I've noticed some performance hiccups on the site recently, and some of you may have noticed them too.  I think it's tied to web spiders over-enthusiastically indexing our sites.

I've updated our logging to try to trap the culprits; I'm not planning to throttle Google or Bing, but if it's Yandex or Baidu causing problems I will stomp on them.
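As an illustration of the kind of tally involved (not the actual logging change - the log format here is an assumption, standard Apache/nginx "combined" format), counting hits per user agent looks something like:

```python
import re
from collections import Counter

# In combined log format, the user agent is the final quoted field on each line.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def top_agents(lines, n=5):
    """Count hits per user-agent string and return the n busiest."""
    counts = Counter()
    for line in lines:
        m = UA_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts.most_common(n)
```

Sorting the busiest agents to the top makes it obvious at a glance whether it's Googlebot or one of the others hammering the site.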

Posted by: Pixy Misa at 12:34 AM | No Comments | Add Comment | Trackbacks (Suck)
Post contains 59 words, total size 1 kb.

Friday, September 07


Speaking Of Servers

I cancelled the server I ordered by mistake, and the replacement is up and running.  It has the same basic specs with two differences: instead of the cancelled server's 8 x 1TB disk drives (I misread the specs and thought they were SSDs), it has 6 x 2TB SSDs.  Really real SSDs this time; I've tested the array at 300,000 IOPS, the equivalent of 2500 regular 7200 RPM disk drives.
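That 2500-drive equivalence is straightforward arithmetic, assuming the usual ~120 random IOPS for a single 7200 RPM drive:

```python
array_iops = 300_000   # measured on the 6 x 2TB SSD array
hdd_iops = 120         # typical random IOPS for one 7200 RPM disk drive
print(array_iops // hdd_iops)  # 2500
```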

And instead of 200TB of monthly bandwidth, it's 1Gbit unmetered.  Which doesn't actually make much difference, because 200TB is close to saturating 1Gbit outbound and I don't do much inbound traffic.
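To put numbers on that claim (assuming a 30-day month):

```python
monthly_bytes = 200e12     # the old plan's 200TB monthly transfer allowance
seconds = 30 * 86400       # seconds in a 30-day month
avg_gbit = monthly_bytes * 8 / seconds / 1e9
print(round(avg_gbit, 2))  # ~0.62 Gbit/s sustained
```

That's the monthly average; with daily peaks well above it, 200TB of outbound transfer and a saturated 1Gbit port amount to much the same thing.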

Oh, and it's software RAID rather than hardware.

Reinstalling it now, configuring RAID-5 and LVM, so I can take consistent snapshots of the entire server without having to worry about managing clean database dumps of MySQL and MongoDB and Elasticsearch and and and...

Then I install KVM and LXC, then I start migrating systems across into their own neat little virtualised containers.

Update 1: Manually configured RAID-1 for boot and RAID-5 for LVM, splotted swap volumes everywhere, and I'm installing Ubuntu 18.04.1 right now.  The auto-install script unhelpfully assigns 100% of the default volume group to /, leaving no room to take snapshots.  I hope I got it right this time, but at least the partitioning is sorted, so a reinstall will be a lot easier if it comes to that.

Update 2: The secret is to use the autoinstall to bring up the server quickly, then use fdisk to create your custom partitioning scheme, then use the Ubuntu expert install mode to install onto those partitions.  Much, much quicker than fiddling about with the partitioning tool in the installer.

Also, don't install to a huge software RAID-5 or 6 array, even on fast SSDs.  It takes at least five times longer than normal because the sync will be running the entire time.  Create a RAID-1 array for the OS and you'll be done that much faster.

Update 3: Whee!  That was so much faster.  Let's see if the network config works this time...

Update 4: Yep, that worked perfectly.

Update 5: Well, I messed that up slightly.  I think I'll just go with LXC here, and leave KVM alone.

Posted by: Pixy Misa at 06:12 PM | No Comments | Add Comment | Trackbacks (Suck)
Post contains 376 words, total size 3 kb.

Saturday, August 18


Found A New Editor!

I'll see if I can deploy it here for some live testing.  I will definitely use it for the new system.

It's called Redactor.  The other one I'm looking at is called Froala.  Froala has some neat tricks, but it's a bit on the expensive side.  Redactor is 60% cheaper, plus I bought a license for an earlier version years ago, so I can upgrade to the latest version for half price.

Redactor has fewer tricks but has a nice API for adding extensions, so I can tie it into my new API.  

For example, if you start typing an @name, I can look up your friends list and show you the matching names, and when you select one, automatically turn it into a link.
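The lookup side of that is simple enough to sketch (the function name and friends list here are made up for illustration; Redactor's actual plugin API isn't shown):

```python
def match_mentions(prefix, friends):
    """Return friends whose names start with the typed @prefix, case-insensitively."""
    p = prefix.lstrip("@").lower()
    return sorted(f for f in friends if f.lower().startswith(p))

# As you type "@pi", the completion list narrows to the matching names.
print(match_mentions("@pi", ["Pixy Misa", "Piro", "Wonderduck"]))
```

Hooking that up to Redactor's extension API and turning the selected name into a link is the part the new editor makes easy.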

Also: Autosave.

Update: Bought a Pro license for Redactor.  I got a Basic license all the way back in 2012 to try it out, but didn't switch at the time.  They've done a whole bunch of updates since then.

Posted by: Pixy Misa at 04:17 AM | No Comments | Add Comment | Trackbacks (Suck)
Post contains 168 words, total size 1 kb.

Saturday, August 11


Maintenance (Just A Bit)

Had to restart the application container due to a memory leak.  

It looked like an I/O error at first, because the first sign of problems was images refusing to load.  But that was because of the on-demand resizing Minx does, which is fairly memory-intensive.  There are no I/O errors in the server logs or the backup logs, so after a quick restart and some parameter tweaks (and an extra backup, but that's still running) we're back on the air.

Right now the application container is using 400MB of its assigned 24GB, so we should be fine while I poke around and find out what happened.  There's only one code change I've made since we moved, but that should use less memory, not more.  Hmm.  Maybe not...

Posted by: Pixy Misa at 02:58 PM | No Comments | Add Comment | Trackbacks (Suck)
Post contains 132 words, total size 1 kb.

Sunday, August 05


More Updates

Sorry, we had another database glitch just now. Fixed and working on the root cause.

Update: Aha! I screwed up the MySQL configuration directory when I was copying the old database server over, so it was running with an out-of-date config file. That led to the issue with not being able to post a few days ago, and the issue with running out of connections just now.

All is stable again.

Posted by: Pixy Misa at 12:41 AM | Comments (4) | Add Comment | Trackbacks (Suck)
Post contains 73 words, total size 1 kb.

Saturday, August 04


Maintenance This Weekend

Doing some more server maintenance this weekend to improve performance and shut off the old server. We might have some glitches; please bear with us.

Update: Found and fixed a couple of slow queries, and built a new index, which affected posting again but only for 12 minutes this time.

Update: We are switching servers right now! This should simply happen, but while it is happening images won't load on this site. They will load on your site, just not on this site, because I'm testing new features here. That should resolve itself soon.

Update: Waiting for DNS changes to propagate to complete the switch. I can already see the load spooling up on the new server and dying down on the old one as more people pick up the DNS change. It's kind of neat.

Update: Images are now loading here. We have normality. I repeat, we have normality. Anything you still can't cope with is therefore your own problem.

Update: Also, performance should be back to normal:
76kb generated in CPU 0.01, elapsed 0.0159 seconds.
14 queries taking 0.0109 seconds, 25 records returned.

Update: And I accidentally shut down the old server while running final backups, but at least that proved that the new server was working...
This weekend's updates brought to you by K-On!!!

Posted by: Pixy Misa at 05:04 PM | No Comments | Add Comment | Trackbacks (Suck)
Post contains 219 words, total size 2 kb.

Monday, July 30


Ongoing Maintenance

So, slight problem with last night's update - there was a glitch with the config file on the new database server which meant that no-one could post new blog entries.  Comments worked fine, but you couldn't post new entries.*

And I was in a meeting all day today without access to the server, so I only just got a chance to find the problem and fix it.

Sorry about that.

Also, creating new posts is weirdly slow.  Other things seem fine.  Updating posts is fine.

I'll get that fixed tomorrow.

Uploading images is also fixed.

*  The config file was correct, but somehow the server was ignoring a critical setting, which meant some specific queries just didn't work.

Posted by: Pixy Misa at 07:18 PM | Comments (1) | Add Comment | Trackbacks (Suck)
Post contains 120 words, total size 1 kb.

Saturday, December 23


So, What Happened There...

I was planning to do a server migration last weekend, but the new server died.

So it got pushed back until basically the last minute before the old server was set to be cancelled on Thursday - and then on Thursday Akismet, the spam filtering service provided by Wordpress, basically dropped dead.  Don't know what's up with that, but it took out not only all commenting, but also the old server itself due to a huge backlog of requests (it was taking forever for Akismet to eventually time out).

This happened while I was migrating the server, and since everything was in flux I had no idea what, specifically, had gone wrong.

The remaining issues were due to an upgrade from MySQL 5.0 to 5.7, which by default runs in strict mode and didn't like some old, sloppy SQL queries and definitions; a couple of places where I'd specified canonical server names instead of functional names; and a proxy configuration error that meant the real IP address wasn't getting passed through to Minx.
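For reference, the strict behaviour comes from MySQL 5.7's new default sql_mode, which added STRICT_TRANS_TABLES and ONLY_FULL_GROUP_BY.  One way to relax it back toward 5.0 behaviour is a my.cnf override along these lines (a sketch, not necessarily the exact setting used here):

```
[mysqld]
# Drop the strict-mode defaults that reject old, sloppy queries 5.0 accepted.
sql_mode = "NO_ENGINE_SUBSTITUTION"
```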

And that, combined with Akismet rejecting all my API requests, meant every commenter was treated as having the same IP address, which was banned.


Working on the recent comments and search functions now.  I had to disable recent comments because MySQL 5.7 can't properly evaluate the query (it takes a minute and a half for something that used to take milliseconds).  I'm not sure yet what happened to search.

On a related note, the MySQL query optimiser doesn't seem to improve over time.  They just move the bugs around.

Update: I think Akismet didn't actually go down; they just changed something we were depending on.  They probably sent me an email six months ago...

Posted by: Pixy Misa at 07:19 PM | No Comments | Add Comment | Trackbacks (Suck)
Post contains 300 words, total size 2 kb.

Powered by Minx 1.1.6c-pink.