"Ship in days, plan in months, dream in years"

Paulo Sérgio Amaral paulo-amaral

paulo-amaral / install_mb.sh
Last active April 8, 2025 01:47
Script to install Metabase and all required packages on Debian 12
#!/bin/bash
## Script to install metabase on Debian 12
## Author: Paulo Amaral
## Note: Please change db, password in the environment part of the script
# Define colors
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
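The snippet above is truncated after the color definitions. A minimal sketch (not part of the original gist) of how such color variables are typically used for status output; the `log_*` helper names are assumptions:

```shell
#!/bin/bash
# Same color definitions as the gist
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color

# Hypothetical helpers: print a colored tag followed by a message.
log_ok()   { echo -e "${GREEN}[OK]${NC} $1"; }
log_warn() { echo -e "${YELLOW}[WARN]${NC} $1"; }
log_err()  { echo -e "${RED}[ERROR]${NC} $1" >&2; }

log_ok "Java installed"
log_warn "Using default Metabase port 3000"
```

`-e` makes `echo` interpret the `\033` escape sequences; errors go to stderr so they survive when stdout is redirected to a log file.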
paulo-amaral / security.conf
Last active December 8, 2023 18:18
best-nginx-configuration-for-security-drupal
#Paulo Sérgio Amaral - 2021
#A+ FOR DRUPAL - BEHIND NGINX REVERSE PROXY
#Security headers, Mozilla Observatory and others.
#Add this line to your server block and create the file in /etc/nginx:
#include security.conf;
#start headers config
add_header X-Frame-Options "SAMEORIGIN" always;
add_header X-XSS-Protection "1; mode=block" always;
add_header X-Content-Type-Options "nosniff" always;
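The file is truncated after the first three headers. A configuration aiming for an A+ grade on scanners like Mozilla Observatory usually also sets HSTS, a referrer policy, and a permissions policy. A sketch of those additional directives; the values shown are common examples, not the gist's actual settings, and should be tuned per site:

```nginx
# Additional hardening headers commonly paired with the three above.
# max-age and policy values are examples; adjust for your site.
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
add_header Permissions-Policy "geolocation=(), microphone=(), camera=()" always;
```

Only enable `Strict-Transport-Security` once the whole site (and its subdomains, if `includeSubDomains` is kept) is served over HTTPS, since browsers cache the policy for the full `max-age`.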
paulo-amaral / finalize_recording.sh
Last active March 28, 2024 05:59
Update the recording file name in Jibri - part of Jitsi Meet
#!/bin/bash
#Script to save recordings by date, to better organize them
#Use on:
#a mounted NFS share (via fstab) or a local folder.
#The folder recordings are saved to is configured in Jibri's JSON file (config.json).
#Dependencies: mailutils
RECORDINGS_DIR=$1
# Find latest modified directory
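The snippet breaks off at the "find latest modified directory" step. A minimal sketch of one common way to do this with GNU `find` (the temporary directory layout here is purely illustrative, not Jibri's real output):

```shell
#!/bin/bash
# Sketch: locate the most recently modified subdirectory of RECORDINGS_DIR.
# Requires GNU find (-printf). The directories created here stand in for
# Jibri's per-conference recording folders.
RECORDINGS_DIR=$(mktemp -d)
mkdir -p "$RECORDINGS_DIR/older" "$RECORDINGS_DIR/newest"
touch -d '1 hour ago' "$RECORDINGS_DIR/older"   # backdate the first dir

# Print "epoch-mtime path" per directory, sort numerically, keep the newest.
LATEST_DIR=$(find "$RECORDINGS_DIR" -mindepth 1 -maxdepth 1 -type d \
  -printf '%T@ %p\n' | sort -n | tail -1 | cut -d' ' -f2-)
echo "$LATEST_DIR"
```

Sorting on `%T@` (modification time as an epoch timestamp) avoids parsing `ls` output, which is fragile when names contain spaces.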
paulo-amaral / install-latest-docker-compose.sh
Created June 1, 2020 06:21
Install latest version of Docker and Docker Compose
#!/bin/bash
set -e
#apt-get install -y curl
cat << "EOF"
___ _ _ ____ _____ _ _ _
|_ _| \ | / ___|_ _|/ \ | | | |
| || \| \___ \ | | / _ \ | | | |
| || |\ |___) || |/ ___ \| |___| |___
|___|_| \_|____/ |_/_/ \_\_____|_____|
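The script is truncated after the banner. The typical steps for this task, as a hedged sketch (the gist's actual commands are not shown; the URLs below are the official Docker convenience script and the Compose release download pattern, and the download/install lines are left commented out since they need root and network access):

```shell
#!/bin/bash
set -e

# Docker Engine via the official convenience script (run as root / sudo):
# curl -fsSL https://get.docker.com | sh

# Latest docker-compose binary for this machine's OS and architecture.
# Current Compose releases name assets docker-compose-<os>-<arch>, lowercase.
os=$(uname -s | tr 'A-Z' 'a-z')
arch=$(uname -m)
url="https://github.com/docker/compose/releases/latest/download/docker-compose-${os}-${arch}"
echo "$url"
# curl -L "$url" -o /usr/local/bin/docker-compose
# chmod +x /usr/local/bin/docker-compose
```

Note that newer Docker packages also ship Compose as the `docker compose` plugin, which removes the need for a separate binary download.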
paulo-amaral / mysql_to_big_query.sh
Created March 8, 2018 15:56 — forked from shantanuo/mysql_to_big_query.sh
Copy a MySQL table to BigQuery. If you need to copy all tables, use the loop given at the end. Exits with error code 3 if BLOB or TEXT columns are found. The CSV files are first copied to Google Cloud Storage before being imported into BigQuery.
#!/bin/sh
TABLE_SCHEMA=$1
TABLE_NAME=$2
mytime=$(date '+%y%m%d%H%M')
hostname=$(hostname | tr 'A-Z' 'a-z')
file_prefix="trimax${TABLE_NAME}${mytime}${TABLE_SCHEMA}"
bucket_name=$file_prefix
splitat="4000000000"
bulkfiles=200
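The snippet ends before the BLOB/TEXT guard the description mentions (exit code 3). A sketch of that check; `check_types` is a hypothetical helper taking a list of column types, whereas the real script would obtain them from MySQL, e.g. `SELECT data_type FROM information_schema.columns WHERE table_schema='$TABLE_SCHEMA' AND table_name='$TABLE_NAME';`:

```shell
#!/bin/sh
# Hypothetical guard: return 3 if any column type is a BLOB or TEXT
# variant (blob, mediumblob, longtext, ...), which BigQuery CSV loads
# cannot carry safely.
check_types() {
  for t in "$@"; do
    case "$t" in
      *blob*|*text*) return 3 ;;
    esac
  done
  return 0
}

check_types "int" "varchar" "datetime" && echo "ok to export"
check_types "int" "longtext" || echo "unsupported column type, would exit 3"
```

In the real script the non-zero return would be followed by `exit 3` so callers looping over many tables can detect and skip the unsupported ones.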