>>/62219/
My days of being responsible for corporate servers are long past as well and I ain't going back.
For anyone faced with a similar situation, here is a rough outline of what they would need to be doing.
0. Do not touch the backups. You have a lot of work to do before you touch the backups.
1. The hardware itself. The firmware and the BIOS can no longer be assumed secure. They can be tampered with and replaced. Reset everything back to factory defaults and re-flash the latest vendor firmware.
2. Maybe they don't use a physical server? A virtualized environment is easier here, but I'd still recreate it from scratch. In that case the vendor's underlying hardware isn't your problem to fix, and it's probably fine even in this situation.
3. Install and update to the most current operating system from scratch. Not from backup. From scratch.
4. Install and update to the most current application software. Not from backup. From scratch.
5. Are the configuration files in binary or text?
6. If they are binary, you use whatever configuration interface created them to recreate the files. Not from backup. From scratch. You may need a completely separate, offline, sacrificial test server to restore the binary config files to, just to have a look at what the hell is going on.
7. If they are text, copy them from the backups to a neutral location. Examine each line, make sure you understand it, and copy each line over to the correct configuration file. One line at a time. (Something like the first sketch after this list is what I mean.)
8. These legacy configurations are probably not completely compatible with the current versions of the OS and applications. You'll need to tweak and test them.
9. And now you need to read up on what else has changed with the new versions of everything. Study. Learn. Tweak some more.
10. All that shit was easy. Once you have all the basics working together as expected, you get into the really fun stuff.
11. Copy the supporting application scripts (yotsuba.php, etc.) to a neutral location and examine them. Line by line. (The same diff trick from step 7 works here.) Once you are satisfied there were no shenanigans inserted, put them in place. Add some test data.
12. Watch everything hilariously fail. The supporting application scripts are legacy code that is not compatible with current standards. Study. Learn. Tweak some more. This step may be the worst nightmare of the bunch.
13. Everything working with test data? Good. Now, restore the backup database data to a neutral location and scan it all for malware and viruses (a crude first-pass sketch is the second one after this list). This is going to take a long fucking time. I'd seriously consider skipping this step! Make an announcement that the hackers encrypted all the data and are holding it for ransom. Yes, that's a lie. But given all the extensive non-affiliated offsite archives? Fuck it. Start from scratch. Anyway, I suspect this is the stage 4chan is in now.
14. Contact the mods, janitors, etc. Do a test launch restricted to them to play around with after they have changed all their passwords and contact information.
15. After a day or two, reopen to the public.
16. No idea how to handle the paid pass situation. At a minimum you want to lock each pass until its holder changes their credentials. That may not matter if the information is already out there.
17. Good fucking luck!
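
For step 7 (and the scripts in step 11), here's the kind of thing I mean: a rough Python sketch that diffs every restored text config against a pristine copy from the fresh install, so you're reviewing actual changes instead of eyeballing whole files. The paths are made up for illustration; point them at your own neutral location.

```python
#!/usr/bin/env python3
"""Sketch only: diff restored configs against pristine ones, line by line.
The directory layout below is hypothetical."""
import difflib
from pathlib import Path

BACKUP_CONFIGS = Path("/review/from-backup/etc")   # copied out of the backups
PRISTINE_CONFIGS = Path("/review/pristine/etc")    # from the fresh, clean install

for backup_file in sorted(BACKUP_CONFIGS.rglob("*")):
    if not backup_file.is_file():
        continue
    rel = backup_file.relative_to(BACKUP_CONFIGS)
    pristine_file = PRISTINE_CONFIGS / rel
    # missing pristine counterpart just means the whole file shows up as added
    old = pristine_file.read_text(errors="replace").splitlines() if pristine_file.exists() else []
    new = backup_file.read_text(errors="replace").splitlines()
    diff = list(difflib.unified_diff(old, new, lineterm="",
                                     fromfile=f"pristine/{rel}", tofile=f"backup/{rel}"))
    if diff:
        print(f"\n=== {rel} ===")
        print("\n".join(diff))
```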
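
For the scan in step 13, run a real scanner, but a dumb first pass that flags the obvious junk attackers like to stash in user data is better than staring at a raw dump. The dump path and the patterns are only illustrative; this is not a substitute for proper malware scanning.

```python
#!/usr/bin/env python3
"""Crude first pass over a restored SQL dump: flag lines containing
suspicious-looking content. Path and patterns are illustrative only."""
import re
from pathlib import Path

DUMP = Path("/review/from-backup/posts.sql")   # hypothetical dump location
SUSPICIOUS = [
    re.compile(rb"<\s*script", re.I),      # injected javascript
    re.compile(rb"<\?php", re.I),          # stray PHP in stored data
    re.compile(rb"base64_decode", re.I),   # classic obfuscation helper
    re.compile(rb"eval\s*\(", re.I),
]

with DUMP.open("rb") as fh:
    for lineno, line in enumerate(fh, 1):
        for pat in SUSPICIOUS:
            if pat.search(line):
                print(f"line {lineno}: matched {pat.pattern!r}: {line[:120]!r}")
                break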
And all this is just an outline. There are a lot of assumed et ceteras in there, like fixing the original vulnerability in the file processor/thumbnailer.
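
On that last point: I obviously don't know what the actual hole in their file processor was, but the generic fix is to never feed the raw upload to the old toolchain. Something in the spirit of this Pillow sketch: decode with a hard pixel cap, re-encode to a known-good format, and thumbnail only the re-encoded copy. The names, limits, and formats here are my assumptions, not whatever 4chan actually runs.

```python
#!/usr/bin/env python3
"""Generic upload-sanitizing sketch with Pillow; limits and formats are assumptions."""
from PIL import Image

Image.MAX_IMAGE_PIXELS = 50_000_000  # refuse decompression bombs outright

def sanitize_and_thumbnail(upload_path: str, clean_path: str, thumb_path: str) -> None:
    with Image.open(upload_path) as img:
        img.load()                    # force a full decode now, not lazily
        clean = img.convert("RGB")    # drop odd modes, metadata, extra chunks
    clean.save(clean_path, "JPEG", quality=90)   # only the re-encoded copy survives
    thumb = clean.copy()
    thumb.thumbnail((250, 250))       # in-place, preserves aspect ratio
    thumb.save(thumb_path, "JPEG", quality=85)

if __name__ == "__main__":
    sanitize_and_thumbnail("upload.bin", "image.jpg", "thumb.jpg")
```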