START ✍
./load.sh "2/in.csv" "2/out.csv" "2/status.csv"
cat 2/out.csv | grep 000
cat 2/out.csv | grep 000 > 2/error.txt
./csv-to-txt-filter.sh "domain" "2/out.csv" "1/in.txt" ".pl"
./csv-to-txt-filter.sh "domain" "2/out.csv" "1/in.txt" ".de"
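The script bodies are not shown here, so the following is only a minimal sketch of what csv-to-txt-filter.sh could look like, assuming it extracts the named column from the CSV and keeps only values ending with the given suffix (the function name csv_to_txt_filter is illustrative):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of csv-to-txt-filter.sh:
# extract one named column from a CSV and keep only
# values that end with a given suffix (e.g. ".pl").
# Usage: csv_to_txt_filter COLUMN IN_CSV OUT_TXT SUFFIX
csv_to_txt_filter() {
  local column="$1" in_csv="$2" out_txt="$3" suffix="$4"
  awk -F',' -v col="$column" -v suf="$suffix" '
    # locate the requested column in the header row
    NR == 1 { for (i = 1; i <= NF; i++) if ($i == col) idx = i; next }
    # print only values whose tail matches the suffix
    idx && substr($idx, length($idx) - length(suf) + 1) == suf { print $idx }
  ' "$in_csv" > "$out_txt"
}
```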
Split CSV to TXT
./csv-to-txt.sh "domain" "2/out.csv" "1/in.txt"
Merge without duplicate columns
./merge-csv.sh "2/out.csv" "3/out.txt" "4/out.txt"
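A minimal sketch of the merging step, assuming merge-csv.sh joins rows on the shared first column (domain) and emits that key column only once. The two-input function below and its argument order are illustrative, not the actual script:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of merge-csv.sh: merge two CSV files
# on their first column, without duplicating that column.
# Usage: merge_csv LEFT_CSV RIGHT_CSV OUT_CSV
merge_csv() {
  local left="$1" right="$2" out="$3"
  awk -F',' -v OFS=',' '
    NR == FNR {                  # first pass: remember right-hand rows by key
      key = $1; $1 = ""; right[key] = substr($0, 2); next
    }
    # second pass: append the right-hand columns when the key matches
    { print $0 (right[$1] != "" ? "," right[$1] : "") }
  ' "$right" "$left" > "$out"
}
```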
- compare (values from in and the current infrastructure)
- load (from the infrastructure)
Compare values from in with the current infrastructure:
./compare.sh
Load values from the infrastructure:
./load.sh
List of projects
./apimacro.sh
Run macro for 1/in.csv
./apimacro.sh 1
The same, with the file paths given explicitly (run the macro for 1/in.csv):
./apimacro.sh "1/in.csv" "1/out.csv" "1/status.csv"
Filter 3/in.csv:
./apimacro.sh 3
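The calling conventions above suggest that apimacro.sh accepts either a single project number or three explicit file paths. A sketch of how that argument dispatch might look, assuming the conventional N/in.csv, N/out.csv, N/status.csv layout (assumed behavior, inferred from the examples):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of apimacro.sh's argument handling:
# a single numeric argument N expands to N/in.csv, N/out.csv,
# N/status.csv; three explicit paths are used as-is.
resolve_args() {
  if [ "$#" -eq 1 ]; then
    echo "$1/in.csv $1/out.csv $1/status.csv"
  else
    echo "$1 $2 $3"
  fi
}
```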
Install ✍
EXAMPLES ✍
All data is available via a microservice or on an FTP server.
Prepare the input data:
- 1/domain-list.txt
example1.com
example2.com
example3.com
Run the script to create the input file:
- 1/in.csv
./create-input-csv.sh "domain,http_status_code" "1/domain-list.txt" "1/in.csv"
Output of the script:
- 1/in.csv
domain,http_status_code
example1.com
example2.com
example3.com
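From the example above, create-input-csv.sh apparently prepends the header line to the domain list. A minimal sketch under that assumption (the function name is illustrative):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of create-input-csv.sh: write the given
# header line, then copy each entry from the domain list as a row.
# Usage: create_input_csv "domain,http_status_code" domain-list.txt in.csv
create_input_csv() {
  local header="$1" list="$2" out="$3"
  { echo "$header"; cat "$list"; } > "$out"
}
```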
./load.sh "1/in.csv" "1/out.csv" "1/status.csv"
Input data:
- 1/in.csv
domain,http_status_code
example1.com
example2.com
example3.com
Execution status, updated after each iteration:
- 1/status.csv
domain,http_status_code
example1.com,done
example2.com,warning
example3.com,
Output data:
- 1/out.csv
domain,http_status_code
example1.com,200
example2.com,500
example3.com,301
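The status file above implies that load.sh processes rows one by one and records progress after every iteration. A sketch of such a loop, with the actual check (presumably a curl call) passed in by function name so the loop stays testable; all names here are illustrative assumptions, not the real implementation:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of load.sh's inner loop: for each domain,
# run a check, append the result to out.csv, and mark the row
# done in status.csv after each iteration.
# Usage: load_loop IN_CSV OUT_CSV STATUS_CSV CHECK_FUNCTION
load_loop() {
  local in_csv="$1" out_csv="$2" status_csv="$3" check="$4" header
  header=$(head -1 "$in_csv")
  echo "$header" > "$out_csv"
  echo "$header" > "$status_csv"
  tail -n +2 "$in_csv" | while IFS=',' read -r domain _; do
    echo "$domain,$("$check" "$domain")" >> "$out_csv"
    echo "$domain,done" >> "$status_csv"   # status updated each iteration
  done
}

# A real check for http_status_code might look like:
#   http_status() { curl -s -o /dev/null -w '%{http_code}' "https://$1"; }
```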
Another example: checking nameservers
Input data:
- 2/in.csv
domain,nameservers
example1.com
example2.com
example3.com
Output data:
- 2/out.csv
domain,nameservers
example1.com,ns1.com,ns2.com,ns3.com
example2.com,ns1.com,ns2.com,ns3.com
example3.com,ns1.com,ns2.com,ns3.com
Another example: taking screenshots
Input data:
- 3/in.csv
domain,screenshot
example1.com
example2.com
example3.com
Output data:
- 3/out.csv
domain,screenshot
example1.com,/home/user/example1.com.png
example2.com,/home/user/example2.com.png
example3.com,/home/user/example3.com.png
The output data can be used to check the state:
Input data:
- 4/in.csv
domain,nameservers
example1.com,ns1.com,ns2.com,ns3.com
example2.com,ns1.com,ns2.com,ns3.com
example3.com,ns1.com,ns2.com,ns3.com
Output data:
- 4/out.csv
domain,nameservers
example1.com,ns1.com,ns2.com,ns3.com
example2.com,ns1.com,ns2.com,ns3.com
example3.com,ns1.com,ns2.com,ns3.com
Filtering is now possible with an external bash script. Headers in the CSV:
- data column
- bash script command name
- First filter name
- First filter value
- Second filter name
- Second filter value
Examples
http_status_code,equal,200
http_status_code,not_equal,200
http_status_code,more_than,200,less_than,300
http_status_code,remove-duplicates
3/in.csv
./apimacro.sh 9
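A sketch of how the filter names above (equal, not_equal, more_than, less_than) could map onto shell comparisons. This is an assumed implementation for illustration, not the actual script; remove-duplicates, which operates on whole columns rather than single values, is omitted:

```shell
#!/usr/bin/env bash
# Hypothetical evaluation of one filter against one cell value.
# Usage: apply_filter VALUE FILTER_NAME FILTER_ARG
# Returns success (0) when the value passes the filter.
apply_filter() {
  local value="$1" name="$2" arg="$3"
  case "$name" in
    equal)      [ "$value" = "$arg" ] ;;
    not_equal)  [ "$value" != "$arg" ] ;;
    more_than)  [ "$value" -gt "$arg" ] ;;   # numeric comparison
    less_than)  [ "$value" -lt "$arg" ] ;;   # numeric comparison
    *)          return 1 ;;                  # unknown filter name
  esac
}
```

Two filters on one row (e.g. more_than,200,less_than,300) would simply be two calls combined with &&.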
TODO ✍
- Clean the input data in the CSV, stripping characters such as:
"
[space]
It would be good to use a standard such as XPath (or similar) to create a more advanced expression format for filtering, e.g.:
http_status_code(http_status_code!=200 && http_status_code!=300)