Here are the frontend and backend servers.

Please install nginx and configure it. A sample configuration file, `nginx_setting`, is provided.

```sh
sudo apt install nginx
sudo systemctl start nginx
```

A domain name and a corresponding certificate are required for this server. To automatically issue, renew, and install the certificate, an ACME client such as Certbot is recommended.
Please install MySQL and create an account for the `superb` database.
- Install

  ```sh
  sudo apt install mysql-server
  sudo mysql_secure_installation
  sudo mysql
  ```

- Create an account

  ```sql
  CREATE USER '<username>'@'localhost' IDENTIFIED WITH mysql_native_password BY '<password>';
  GRANT ALL PRIVILEGES ON *.* TO '<username>'@'localhost' WITH GRANT OPTION;
  ```

- Create a `superb` database

  ```sql
  CREATE DATABASE superb;
  ```

Please refer to ./frontend/
Please refer to ./backend/
```sql
ALTER TABLE superb.scores
ADD ${column_name} float;
```

- Please mind that `${column_name}` = `${task}_${metric}_${mode}`, e.g. `PR_per_public` and `QbE_mtwv_hidden`.
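The naming rule above can be sketched as follows (the task/metric/mode values are just the examples given in this list):

```python
def score_column_name(task: str, metric: str, mode: str) -> str:
    """Compose a score column name as ${task}_${metric}_${mode}."""
    return f"{task}_{metric}_{mode}"

print(score_column_name("PR", "per", "public"))    # PR_per_public
print(score_column_name("QbE", "mtwv", "hidden"))  # QbE_mtwv_hidden
```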
- Put your ground truth in the `./backend/inference/truth/${task}` folder.
- Modify `./backend/calculate.py` with the following template:

  ```python
  #============================================#
  #                  ${task}                   #
  #============================================#
  # ${task} {public/hidden}
  if os.path.isdir(os.path.join(predict_root, "${task_folder}")):
      if os.path.isfile(os.path.join(predict_root, "${task_folder}", "${gt_file}")):
          if ...:  # whether what the user uploaded is valid
              print("[${task} ${public/hidden}]", file=output_log_f)
              try:
                  score = ...  # calculation
                  print(f"${task}: ${metric} {score}", file=output_log_f)
                  score_model.${task}_${metric}_${mode} = score
                  session.commit()
              except Exception as e:
                  print(e, file=output_log_f)
  ```
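As a self-contained sketch of the template's guard structure (the task name `PR`, the folder `pr_public`, the file `predict.ark`, and the score value are all hypothetical; in the real `calculate.py` the score is written to `score_model` and committed via `session`):

```python
import os
import tempfile

def evaluate_pr_public(predict_root):
    """Sketch of the template's guards for a hypothetical task "PR" (public split)."""
    task_folder = os.path.join(predict_root, "pr_public")    # stands in for ${task_folder}
    predict_file = os.path.join(task_folder, "predict.ark")  # stands in for ${gt_file}
    if not os.path.isdir(task_folder):
        return None  # the submission does not include this task
    if not os.path.isfile(predict_file):
        return None  # the expected file is missing
    # "what the user uploaded is valid" -> compute the metric
    score = 0.042  # placeholder for the real metric calculation
    # In the real calculate.py: score_model.PR_per_public = score; session.commit()
    return score

root = tempfile.mkdtemp()
print(evaluate_pr_public(root))  # None: nothing uploaded yet
os.makedirs(os.path.join(root, "pr_public"))
open(os.path.join(root, "pr_public", "predict.ark"), "w").close()
print(evaluate_pr_public(root))  # 0.042
```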
- Modify `./backend/models/score.py`. Add a column like:

  ```python
  ${task}_${metric}_${mode} = db.Column(db.Float)
  ```

- Modify `./backend/models/naive_models.py`. Add a column to the `ScoreModel` class like:

  ```python
  ${task}_${metric}_${mode} = db.Column(db.Float)
  ```

- Modify `./backend/configs.yaml`. Add your new task info to the `SCORE` section of `INDIVIDUAL_SUBMISSION_INFO` and `LEADERBOARD_INFO`.
- (Optional) Add calculated scores of your new task/metric for official models by modifying the `get_leaderboard_default()` function defined in `./backend/utils.py`.
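A hedged sketch of what extending the defaults might look like (the actual structure of `get_leaderboard_default()` in `./backend/utils.py` may differ; the model names and the zero scores below are placeholders, not real results):

```python
# Placeholder defaults for official models; the real values come from the
# official evaluation, and the real function may build ORM objects rather
# than plain dicts.
OFFICIAL_DEFAULTS = [
    {"submitName": "baseline-A", "PR_per_public": 0.0},
    {"submitName": "baseline-B", "PR_per_public": 0.0},
]

def get_leaderboard_default():
    rows = [dict(row) for row in OFFICIAL_DEFAULTS]
    for row in rows:
        # Add the new task/metric column for every official model.
        row.setdefault("QbE_mtwv_hidden", 0.0)
    return rows

print(get_leaderboard_default())
```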
- Append the `individual_submission_columnInfo` array in `./frontend/src/Data.js` with:

  ```js
  ${task}_${metric}_${mode}: {
      header: "${task} ${mode}",
      width: 100,
      higherBetter: false,
      isScore: true,
      type: "number",
  },
  ```

- Append the `leaderboard_columnInfo` array in `./frontend/src/Data.js` with:

  ```js
  ${task}_${metric}_${mode}: {
      header: "${task} ${mode}",
      width: 100,
      higherBetter: false,
      isScore: true,
      type: "number",
  },
  ```

```sql
ALTER TABLE superb.files
ADD ${column_name} bigint unsigned default NULL; -- if required: remove "default NULL"
```
- Modify `./backend/models/file.py`. Add a column like:

  ```python
  from sqlalchemy.dialects.mysql import BIGINT

  ${column_name} = db.Column(BIGINT)
  ```

- Modify `./backend/models/naive_models.py`. Add a column to the `FileModel` class like:

  ```python
  from sqlalchemy.dialects.mysql import BIGINT

  ${column_name} = db.Column(BIGINT)
  ```

- Modify `./backend/configs.yaml`. Add your new task info to the `FILE` section of `INDIVIDUAL_SUBMISSION_INFO` and `LEADERBOARD_INFO`.
- (Optional) Add fields of your new task/metric for official models by modifying the `get_leaderboard_default()` function defined in `./backend/utils.py`.
- Append the `individual_submission_columnInfo` array in `./frontend/src/Data.js` with:

  ```js
  ${column_name}: {
      header: "${column_name}",
      width: 100,
      higherBetter: false,
      isScore: true,
      type: "number",
  },
  ```

- Append the `leaderboard_columnInfo` array in `./frontend/src/Data.js` with:

  ```js
  ${column_name}: {
      header: "${column_name}",
      width: 100,
      higherBetter: false,
      isScore: true,
      type: "number",
  },
  ```