FoolFuuka

FoolFuuka-and-Asagi.png

FoolFuuka is a PHP web viewer paired with the Asagi archiver (by Eksopl). It was developed by the FoolCode team, who originally created this fork for archive.foolz.us. It is currently maintained by 4plebs and the Bibliotheca Anonoma.

Source Code

The original developers of Asagi (Eksopl) and FoolFuuka (FoolCode) have stepped back to maintenance-only development since they stopped operating 4chan archivers, so these two forks are now the mainline development repositories for the FoolFuuka/Asagi stack, maintained in a multi-organizational collaboration between 4plebs and the Bibliotheca Anonoma.

  • pleebe/FoolFuuka - The 4plebs fork of FoolFuuka is the mainline fork used by 4plebs, Desuarchive, RebeccaBlackTech, and many other archivers.
  • bibanon/asagi - The bibanon fork of Asagi is the mainline fork used by Desuarchive, RebeccaBlackTech, and Archived.moe.

Plugins

Future FoolFuuka/Asagi Replacement

These components are pieces of a new archiver stack intended to replace the dated FoolFuuka/Asagi stack, part of a multi-organizational effort to build a new Ayase Archival Standard, which could even become a multisite system.

  • bibanon/eve [4chan Scraper] - (Stability testing) Desuarchive currently uses the Python-based Eve scraper experimentally to archive the troublesome board wsg, and eventually gif.
  • pleebe/4plebs-X [4chan API Compatible Client-side Webapp] - (In development) 4plebs developed a new client-side JavaScript interface (based on 4chan-X code) for the FoolFuuka/Asagi stack by adding 4chan API compatibility to FoolFuuka. A public preview is available at test.4plebs.org. Requires the pleebe/foolfuuka-plugin-fourchan-api plugin below (see the hedged example request after this list).
  • pleebe/foolfuuka-plugin-fourchan-api [4chan API for FoolFuuka] - (In development) 4chan API Compatibility plugin for FoolFuuka.
  • A modified fork of Clover that uses the 4chan API could serve as a mobile frontend. We may also look for something that can be deployed with Qt or ncurses.
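
As a hedged illustration of what 4chan API compatibility means in practice: a client such as 4plebs-X can fetch archive data using the same JSON path layout as the upstream 4chan API. The host and path below are assumptions (the test.4plebs.org preview and the standard /{board}/thread/{no}.json layout); the plugin's actual routes may differ.

   # Illustrative only: fetch a thread in 4chan-API JSON format from a FoolFuuka
   # instance running the fourchan-api plugin (host and path layout are assumptions).
   curl -s 'https://test.4plebs.org/pol/thread/1234567.json'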

Installation

Administration

Architecture

An advanced section for setting up the architecture needed to run FoolFuuka/Asagi/SphinxSearch at scale.

  • FoolFuuka/Nginx
  • FoolFuuka/MySQL - We use Percona MariaDB, a MySQL-compatible database, with the TokuDB storage engine.
  • FoolFuuka/Sphinx/Advanced - At Desuarchive, the search server serves both Desuarchive and rbt.asia and is hosted on a server separate from the frontend, database, and scraper, with OpenVPN connecting the two hosts.

Backups

FoolFuuka/Asagi is notorious for its large and unwieldy database. We use this command for daily and weekly dumps without running into locking problems.

mysqldump -u $DB_user -h $DB_host -p$DB_pass --opt --single-transaction --quick --lock-tables=false $DB $t | gzip > $DIR/$DB-$t.sql.gz
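
The variables $DB, $t, $DIR, and the credentials come from the surrounding backup script, which is not shown on this page. A minimal sketch of such a per-table loop, with illustrative values for every variable, might look like this:

   #!/bin/bash
   # Illustrative wrapper: dump every table of the archive DB into its own gzip'd file.
   # All values below are placeholders; substitute your own credentials and paths.
   DB_user=backup
   DB_pass=backuppassword
   DB_host=127.0.0.1
   DB=asagi                 # database holding the board tables
   DIR=/home/atc/sql/desu   # destination directory for the dumps

   for t in $(mysql -u "$DB_user" -h "$DB_host" -p"$DB_pass" -N -e 'SHOW TABLES;' "$DB"); do
       mysqldump -u "$DB_user" -h "$DB_host" -p"$DB_pass" --opt --single-transaction --quick \
           --lock-tables=false "$DB" "$t" | gzip > "$DIR/$DB-$t.sql.gz"
   done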

Dumping FoolFuuka Tables without leaking IPs or Passwords

CSV

Don't just release the above SQL dump into the wild. The query below dumps a board table to CSV instead, which is quite easy to import back (see the import sketch after the query).

It may be a good idea to drop the doc_id and media_id columns, as 4plebs does: https://archive.org/details/4plebs-org-data-dump-2018-01

SELECT `doc_id`, `media_id`, `num`, `subnum`, `thread_num`, `op`, `timestamp`, `timestamp_expired`, `preview_orig`, `preview_w`, `preview_h`, `media_filename`, `media_w`, `media_h`, `media_size`, `media_hash`, `media_orig`, `spoiler`, `deleted`, `capcode`, `email`, `name`, `trip`, `title`, `comment`, `sticky`, `locked`, `poster_hash`, `poster_country`, `exif`
FROM `pol`
INTO OUTFILE "/tmp/a.csv"
  FIELDS TERMINATED BY ',' ENCLOSED BY '"'
  LINES TERMINATED BY "\n";
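
Since the CSV is meant to be easy to import back, here is a hedged sketch of the reverse direction using LOAD DATA INFILE, assuming a target table with the same columns as the export above (credentials, database, and path are illustrative):

   # Illustrative re-import of /tmp/a.csv into a `pol` table with the same schema.
   mysql -u backup -pbackuppassword asagi -e "
       LOAD DATA INFILE '/tmp/a.csv'
       INTO TABLE \`pol\`
       FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
       LINES TERMINATED BY '\n'
       (doc_id, media_id, num, subnum, thread_num, op, timestamp, timestamp_expired,
        preview_orig, preview_w, preview_h, media_filename, media_w, media_h, media_size,
        media_hash, media_orig, spoiler, deleted, capcode, email, name, trip, title,
        comment, sticky, locked, poster_hash, poster_country, exif);"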

SQL

mysqldump doesn't support dropping columns from a dump, so it's not usable for making public .sql dumps. One way to do this is with https://github.com/bbepis/mysqlchump. Example command:

mysqlchump --table a --connectionString "Server=127.0.0.1;Database=asagi;User Id=backup;Password=backuppassword;" --select 'SELECT `doc_id`, `media_id`, `num`, `subnum`, `thread_num`, `op`, `timestamp`, `timestamp_expired`, `preview_orig`, `preview_w`, `preview_h`, `media_filename`, `media_w`, `media_h`, `media_size`, `media_hash`, `media_orig`, `spoiler`, `deleted`, `capcode`, `email`, `name`, `trip`, `title`, `comment`, `sticky`, `locked`, `poster_hash`, `poster_country`, `exif` FROM `a` ORDER BY `doc_id` ASC' a.sql

With some external tooling, you can also do incremental dumps by adding a WHERE clause to the select query with the last `doc_id` from your most recent dump.
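
A hedged sketch of such an incremental dump, assuming the last exported doc_id is kept in a plain text file between runs (the tracking file, credentials, and output name are illustrative; the mysqlchump flags are the ones shown above):

   # Illustrative incremental dump for board `a`: export only rows added since the last run.
   COLS='`doc_id`, `media_id`, `num`, `subnum`, `thread_num`, `op`, `timestamp`, `timestamp_expired`, `preview_orig`, `preview_w`, `preview_h`, `media_filename`, `media_w`, `media_h`, `media_size`, `media_hash`, `media_orig`, `spoiler`, `deleted`, `capcode`, `email`, `name`, `trip`, `title`, `comment`, `sticky`, `locked`, `poster_hash`, `poster_country`, `exif`'
   LAST_ID=$(cat /home/atc/sql/last_doc_id_a 2>/dev/null || echo 0)   # high-water mark from the previous dump
   mysqlchump --table a \
       --connectionString "Server=127.0.0.1;Database=asagi;User Id=backup;Password=backuppassword;" \
       --select "SELECT $COLS FROM \`a\` WHERE \`doc_id\` > $LAST_ID ORDER BY \`doc_id\` ASC" \
       a-incremental.sql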

Automatic Full SQL DB Backups

crontab -e

   0 3 * * * /bin/bash /home/atc/sql/full.sh

The full.sh script does the following:

  • Backs up the foolfuuka, rbt_foolfuuka, and asagi databases, iterating through each DB.
  • Writes gzip'd SQL dumps to /home/atc/sql/desu (it is normal for this directory to be empty).
  • Uploads the dumps to an encrypted rclone folder.
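
The script itself is not published on this page; a minimal sketch of what such a full.sh could look like, with the remote name and credentials as assumptions, is:

   #!/bin/bash
   # Illustrative sketch of full.sh: dump each archive database, gzip it,
   # then move the dumps to an encrypted rclone remote (names are placeholders).
   DIR=/home/atc/sql/desu
   for DB in foolfuuka rbt_foolfuuka asagi; do
       mysqldump -u backup -pbackuppassword --opt --single-transaction --quick \
           --lock-tables=false "$DB" | gzip > "$DIR/$DB-$(date +%F).sql.gz"
   done
   # rclone move deletes the local files after upload, which is why the
   # directory above is normally empty between runs.
   rclone move "$DIR" encrypted-remote:sql-backups/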

Automatic CSV Dumps as seen on the Internet Archive

   php /home/atc/backup.php
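
To run automatically, this would live in cron like the SQL backups above; the schedule below is an assumption:

   # Illustrative crontab entry: run the CSV dump weekly at 04:00.
   0 4 * * 0 /usr/bin/php /home/atc/backup.php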

Backupbot Database

 CREATE DATABASE backupbot DEFAULT CHARACTER SET utf8mb4 DEFAULT COLLATE utf8mb4_general_ci;

 GRANT ALL PRIVILEGES ON `backupbot`.* TO 'backupbot'@'localhost';
 GRANT FILE ON *.* TO 'backupbot'@'localhost';
 GRANT SELECT ON `asagi`.* TO 'backupbot'@'localhost';
 GRANT SELECT ON `foolfuuka`.* TO 'backupbot'@'localhost';
 GRANT SELECT ON `rbt_foolfuuka`.* TO 'backupbot'@'localhost';
 FLUSH PRIVILEGES;

 USE backupbot;
 CREATE TABLE backupstats (
     board VARCHAR(20) PRIMARY KEY,
     lastid INT
 );
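
The backupstats table appears to record, per board, the highest doc_id that has already been exported, so that each run of the backup bot only dumps new rows. The real logic lives in backup.php; the shell sketch below only illustrates the pattern, and the queries, credentials, and paths in it are assumptions:

   #!/bin/bash
   # Illustrative incremental CSV export for one board, driven by backupbot.backupstats.
   BOARD=pol
   MYSQL="mysql -u backupbot -pbackupbotpassword"   # placeholder credentials

   # Previous high-water mark (0 if this board has never been dumped).
   LAST_ID=$($MYSQL -N -e "SELECT COALESCE(MAX(lastid), 0) FROM backupbot.backupstats WHERE board = '$BOARD';")
   # Current highest doc_id in the board table.
   NEW_MAX=$($MYSQL -N -e "SELECT COALESCE(MAX(\`doc_id\`), 0) FROM \`asagi\`.\`$BOARD\`;")

   # Export only the new rows (column list trimmed here for brevity).
   $MYSQL -e "SELECT \`num\`, \`subnum\`, \`thread_num\`, \`timestamp\`, \`comment\`
              FROM \`asagi\`.\`$BOARD\`
              WHERE \`doc_id\` > $LAST_ID AND \`doc_id\` <= $NEW_MAX
              INTO OUTFILE '/tmp/$BOARD-$NEW_MAX.csv'
              FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n';"

   # Record the new high-water mark for the next run.
   $MYSQL -e "REPLACE INTO backupbot.backupstats (board, lastid) VALUES ('$BOARD', $NEW_MAX);"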