FoolFuuka
[[File:FoolFuuka-and-Asagi.png|thumb]]
'''FoolFuuka''' is a PHP web viewer paired with the Asagi archiver (by Eksopl). It was developed by the FoolCode team, who originally created this fork for archive.foolz.us. It is currently maintained by 4plebs and the Bibliotheca Anonoma.
== Source Code ==
Since the main developers of Asagi (Eksopl) and [https://github.com/FoolCode/FoolFuuka FoolFuuka (FoolCode)] stepped away from anything beyond maintenance development after they stopped operating 4chan archivers, the following two forks are the mainline development repositories for the FoolFuuka/Asagi stack, a multiorganizational collaboration between 4plebs and the Bibliotheca Anonoma.
* [https://github.com/pleebe/FoolFuuka pleebe/FoolFuuka] - The 4plebs fork of FoolFuuka, the mainline fork used by 4plebs, Desuarchive, RebeccaBlackTech, and many other archivers.
* [https://github.com/bibanon/asagi bibanon/asagi] - The Bibliotheca Anonoma fork of Asagi, the mainline fork used by Desuarchive, RebeccaBlackTech, and Archived.moe.
** [[FoolFuuka/Asagi|Asagi Design Document (WIP)]]
=== Plugins ===
* [https://github.com/pleebe/foolfuuka-plugin-intel-share foolfuuka-plugin-intel-share] - Banned hash sharing plugin for 4chan archivers.
* [https://github.com/desuarchive/foolfuuka-plugin-ban-logging desuarchive/foolfuuka-plugin-ban-logging] - 4chan ban logging plugin for FoolFuuka.
=== Future FoolFuuka/Asagi Replacement ===
These components are pieces of a new archiver stack intended to replace the dated FoolFuuka/Asagi stack: a multiorganizational effort to build a new [[Ayase]] Archival Standard, which could eventually even support a multisite system.
* [https://github.com/bibanon/eve bibanon/eve] - [4chan Scraper] (Stability testing) Desuarchive currently uses the Python-based Eve scraper experimentally to archive the troublesome board /wsg/, and eventually /gif/.
* [https://github.com/pleebe/4plebs-X pleebe/4plebs-X] - [4chan API Compatible Client-side Webapp] (In development) A client-side JavaScript interface (based on 4chan-X code) for the FoolFuuka/Asagi stack, built by 4plebs by adding 4chan API compatibility to FoolFuuka. A public preview is available at test.4plebs.org. Requires the pleebe/foolfuuka-plugin-fourchan-api plugin below.
* [https://github.com/pleebe/foolfuuka-plugin-fourchan-api pleebe/foolfuuka-plugin-fourchan-api] - [4chan API for FoolFuuka] (In development) 4chan API compatibility plugin for FoolFuuka.
* A modified fork of Clover using the 4chan API could serve as a mobile frontend. We may also look for something that can be deployed with Qt or ncurses.
== Installation ==
* [[FoolFuuka/Install/EL8|Installing on RHEL8/Rocky/Alma Linux 8]] - We prefer Rocky/Alma Linux 8 with SELinux, with PHP 7.4 supported until 2030.
<!-- * [[FoolFuuka/Install/Debian|Installing on Debian Jessie]] - This won't work -->
* [[FoolFuuka/Install/Ubuntu16|Installing on Ubuntu 16.04 LTS]]
* [[FoolFuuka/Install/Docker|FoolFuuka/Ayase Stack with Docker Compose]] - An orchestrator for Percona MariaDB, PHP 7, Asagi, and Sphinxsearch.
* [[FoolFuuka/Install/Windows|Installing on Windows, Apache, MySQL, PHP]] - Not recommended, but it can work.
== Administration ==
* [[FoolFuuka/Start]] - Start up Asagi, Sphinxsearch, and FoolFuuka.
* [[FoolFuuka/Import]] - Import data from the dumps.
* [[FoolFuuka/Housekeeping]] - You may be ordered by the provider or the state, or asked by the person themselves, to clean up data. Here's how.
* [[FoolFuuka/Troubleshooting]] - Restart the site if it's down, or troubleshoot if there are problems.
* [[FoolFuuka/Upgrading]] - Upgrade FoolFuuka to the latest version.
* [[FoolFuuka/Plugins]] - Plugins and extensions you can use.
== Architecture ==
An advanced section on setting up the architecture needed to run FoolFuuka/Asagi/Sphinxsearch at scale.
* [[FoolFuuka/Nginx]]
* [[FoolFuuka/MySQL]] - We use Percona MariaDB, a MySQL variant, with the TokuDB storage engine.
* [[FoolFuuka/Sphinx/Advanced]] - At Desuarchive, the search server serves both Desuarchive and Rbt.asia, and is hosted on a server separate from the frontend, database, and scraper, with OpenVPN connecting the two.
== Backups ==
FoolFuuka/Asagi is notorious for its large and unwieldy database. We use the following command to take daily and weekly dumps without running into locking problems.
<pre>
mysqldump -u $DB_user -h $DB_host -p$DB_pass --opt --single-transaction --quick --lock-tables=false $DB $t | gzip > $DIR/$DB-$t.sql.gz
</pre>
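The <code>$t</code> and <code>$DB</code> variables imply a wrapper loop over tables. A minimal sketch of such a wrapper in Python follows; the database name, table list, and output directory are hypothetical, and it assumes the password comes from <code>~/.my.cnf</code> rather than <code>-p</code>. It only builds the command strings, which is enough to show the shape of the loop:

```python
import shlex

# Hypothetical values for illustration; substitute your own credentials,
# database name, table list, and output directory.
DB_USER, DB_HOST, DB = "backup", "127.0.0.1", "asagi"
DIR = "/backups/daily"
TABLES = ["a", "c", "wsg"]  # one Asagi table per archived board

def dump_command(table: str) -> str:
    """Build the per-table mysqldump-to-gzip pipeline shown above."""
    dump = (
        f"mysqldump -u {DB_USER} -h {DB_HOST} "
        "--opt --single-transaction --quick --lock-tables=false "
        f"{DB} {shlex.quote(table)}"
    )
    return f"{dump} | gzip > {DIR}/{DB}-{table}.sql.gz"

for t in TABLES:
    print(dump_command(t))
```

Feeding each string to a shell (or calling mysqldump directly) is left to the real script; the point is that one gzipped file per table keeps individual dumps manageable.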
=== Dumping FoolFuuka Tables without Leaking IPs or Passwords ===
==== CSV ====
Don't just release the above SQL dump into the wild, as the raw tables contain poster IPs and deletion passwords. The query below dumps only the safe columns to CSV, which is quite easy to import back. It may also be a good idea to drop the doc_id and media_id columns, as 4plebs does in its public dumps: https://archive.org/details/4plebs-org-data-dump-2018-01
{{bc|<nowiki>
SELECT `doc_id`, `media_id`, `num`, `subnum`, `thread_num`, `op`, `timestamp`, `timestamp_expired`, `preview_orig`, `preview_w`, `preview_h`, `media_filename`, `media_w`, `media_h`, `media_size`, `media_hash`, `media_orig`, `spoiler`, `deleted`, `capcode`, `email`, `name`, `trip`, `title`, `comment`, `sticky`, `locked`, `poster_hash`, `poster_country`, `exif`
FROM `pol`
INTO OUTFILE "/tmp/a.csv"
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY "\n";
</nowiki>}}
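If you want to drop the doc_id and media_id columns after exporting, the CSV can be post-processed. A sketch in Python (treat it as illustrative: MySQL's default backslash escaping in OUTFILE dumps differs slightly from Python's csv dialect, so exotic comment text may need extra handling):

```python
import csv
import io

# Column order must match the SELECT ... INTO OUTFILE statement above.
COLUMNS = [
    "doc_id", "media_id", "num", "subnum", "thread_num", "op",
    "timestamp", "timestamp_expired", "preview_orig", "preview_w",
    "preview_h", "media_filename", "media_w", "media_h", "media_size",
    "media_hash", "media_orig", "spoiler", "deleted", "capcode",
    "email", "name", "trip", "title", "comment", "sticky", "locked",
    "poster_hash", "poster_country", "exif",
]
DROP = {"doc_id", "media_id"}  # internal IDs, dropped in the 4plebs dumps
KEEP = [i for i, name in enumerate(COLUMNS) if name not in DROP]

def strip_columns(src, dst):
    """Copy a CSV stream, keeping only the columns not listed in DROP."""
    writer = csv.writer(dst, lineterminator="\n")
    for row in csv.reader(src):
        writer.writerow([row[i] for i in KEEP])

# Tiny in-memory demonstration: a 30-field row loses its first two fields.
src, dst = io.StringIO(), io.StringIO()
csv.writer(src, lineterminator="\n").writerow(["1", "5"] + ["x"] * 28)
src.seek(0)
strip_columns(src, dst)
```

For real dumps, open the exported file and a destination file instead of the in-memory streams.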
==== SQL ====
mysqldump doesn't support dropping columns from a dump, so it's not usable for making public .sql dumps. One way around this is https://github.com/bbepis/mysqlchump. Example command:
<pre>mysqlchump --table a --connectionString "Server=127.0.0.1;Database=asagi;User Id=backup;Password=backuppassword;" --select 'SELECT `doc_id`, `media_id`, `num`, `subnum`, `thread_num`, `op`, `timestamp`, `timestamp_expired`, `preview_orig`, `preview_w`, `preview_h`, `media_filename`, `media_w`, `media_h`, `media_size`, `media_hash`, `media_orig`, `spoiler`, `deleted`, `capcode`, `email`, `name`, `trip`, `title`, `comment`, `sticky`, `locked`, `poster_hash`, `poster_country`, `exif` FROM `a` ORDER BY `doc_id` ASC' a.sql</pre>
With some external tooling, you can also take incremental dumps by adding a WHERE clause to the select query using the last `doc_id` from your most recent dump.
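That external tooling can be as simple as a state file remembering the last `doc_id` per table. A hypothetical sketch (the state-file path and helper names are made up, not part of mysqlchump):

```python
import json
import pathlib

# Hypothetical state file mapping table name -> last dumped doc_id.
STATE = pathlib.Path("dump-state.json")

def incremental_select(table: str, columns: list[str]) -> str:
    """Build the --select query for mysqlchump, resuming after the last dump."""
    state = json.loads(STATE.read_text()) if STATE.exists() else {}
    last = state.get(table, 0)
    cols = ", ".join(f"`{c}`" for c in columns)
    return (
        f"SELECT {cols} FROM `{table}` "
        f"WHERE `doc_id` > {last} ORDER BY `doc_id` ASC"
    )

def record_progress(table: str, max_doc_id: int) -> None:
    """After a successful dump, store the highest doc_id it contained."""
    state = json.loads(STATE.read_text()) if STATE.exists() else {}
    state[table] = max_doc_id
    STATE.write_text(json.dumps(state))
```

Each run passes the generated query to mysqlchump's <code>--select</code> option, then records the highest `doc_id` the dump actually contained.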
=== Automatic Full SQL DB Backups ===
Add a cron entry (crontab -e):
<pre>
0 3 * * * /bin/bash /home/atc/sql/full.sh
</pre>
The script backs up the foolfuuka, rbt_foolfuuka, and asagi databases. It:
* Iterates through each database
* Writes gzipped SQL files to /home/atc/sql/desu (it is normal for this directory to be empty afterwards)
* Uploads them to an encrypted RClone folder
=== Automatic CSV Dumps as seen on Internet Archive ===
<pre>
php /home/atc/backup.php
</pre>
==== Backupbot Database ====
<pre>
CREATE DATABASE backupbot DEFAULT CHARACTER SET utf8mb4 DEFAULT COLLATE utf8mb4_general_ci;
GRANT ALL PRIVILEGES ON `backupbot`.* TO 'backupbot'@'localhost';
GRANT FILE ON *.* TO 'backupbot'@'localhost';
GRANT SELECT ON `asagi`.* TO 'backupbot'@'localhost';
GRANT SELECT ON `foolfuuka`.* TO 'backupbot'@'localhost';
GRANT SELECT ON `rbt_foolfuuka`.* TO 'backupbot'@'localhost';
FLUSH PRIVILEGES;

USE backupbot;
CREATE TABLE backupstats (
  board VARCHAR(20) PRIMARY KEY,
  lastid INT
);
</pre>
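The backupstats table tracks the last doc_id already dumped for each board, so each CSV run only needs to export newer rows. A self-contained sketch of that bookkeeping logic, using Python's sqlite3 as a stand-in for MariaDB (the helper names are hypothetical, not from backup.php):

```python
import sqlite3

# sqlite3 stands in for MariaDB so this sketch runs anywhere.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE backupstats (board VARCHAR(20) PRIMARY KEY, lastid INT)"
)

def resume_from(board: str) -> int:
    """doc_id to resume dumping from (0 if the board was never dumped)."""
    row = db.execute(
        "SELECT lastid FROM backupstats WHERE board = ?", (board,)
    ).fetchone()
    return row[0] if row else 0

def mark_done(board: str, lastid: int) -> None:
    """Record the highest doc_id included in the finished dump."""
    db.execute(
        "INSERT INTO backupstats (board, lastid) VALUES (?, ?) "
        "ON CONFLICT(board) DO UPDATE SET lastid = excluded.lastid",
        (board, lastid),
    )
```

A run would call <code>resume_from("a")</code>, dump rows with a higher doc_id, then <code>mark_done("a", highest_id)</code> once the upload succeeds.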