Writing a complete bash script

This assignment is about writing a bash script that loops over the contents of the "hosts" file; the end result will be that it creates a ".cookies" file for each of the websites.
The instructions are attached.


Enter bash. Then create a file called "hosts" that has the following entries:

bash-4.2$ cat hosts
www.duckduckgo.com
www.eff.org
www.google.com
www.msnbc.com
en.wikipedia.org
www.yahoo.com
www.zillow.com

This assignment is a complete bash script named "assign6.sh" that should, when executed, do the following:

bash-4.2$ ./assign6.sh
Doing www.duckduckgo.com with command : wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies www.duckduckgo.com.cookies https://www.duckduckgo.com
mkdir: created directory ‘work-dir’

real 0m4.061s
user 0m0.066s
sys 0m0.078s

Doing www.eff.org with command : wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies www.eff.org.cookies https://www.eff.org
mkdir: created directory ‘work-dir’

real 4m20.190s
user 0m1.817s
sys 0m1.613s

Doing www.google.com with command : wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies www.google.com.cookies https://www.google.com
mkdir: created directory ‘work-dir’

real 0m26.349s
user 0m0.431s
sys 0m0.360s

Doing www.msnbc.com with command : wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies www.msnbc.com.cookies https://www.msnbc.com
mkdir: created directory ‘work-dir’

real 2m4.089s
user 0m2.081s
sys 0m1.376s

Doing en.wikipedia.org with command : wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies en.wikipedia.org.cookies https://en.wikipedia.org
mkdir: created directory ‘work-dir’

real 5m18.904s
user 0m8.457s
sys 0m5.398s

Doing www.yahoo.com with command : wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies www.yahoo.com.cookies https://www.yahoo.com
mkdir: created directory ‘work-dir’

real 4m37.653s
user 0m1.906s
sys 0m2.533s

Doing www.zillow.com with command : wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies www.zillow.com.cookies https://www.zillow.com
mkdir: created directory ‘work-dir’

real 0m0.679s
user 0m0.025s
sys 0m0.013s

Clearly, your "assign6.sh" script should loop over the contents of the "hosts" file; the end result will be that it creates a ".cookies" file for each of the websites, using the "wget" command you can see above. Your script should also produce the same output as shown above (please do include the timing element in your script). When yours finishes running (it will probably take 20-30 minutes unless you are clever about your implementation), you should run "wc" over the result files and see something like:

bash-4.2$ wc -l *.cookies
  109 en.wikipedia.org.cookies
    4 www.duckduckgo.com.cookies
   47 www.eff.org.cookies
   11 www.google.com.cookies
   31 www.msnbc.com.cookies
   39 www.yahoo.com.cookies
    9 www.zillow.com.cookies
  250 total

It's not likely (though certainly possible) that your wc output will be identical; it is conceivable that your "wc" output is quite different, since most of these websites change on a regular basis.

It is very important to notice that the temporary subdirectory "work-dir" is created each time, and then deleted at the end of each iteration.

This is an easy assignment. My reference version has only 9 lines of active code, two blank lines, and the initial comment containing #!/bin/bash. Please submit your "assign6.sh" script (no need to submit any output or the hosts file).
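The per-iteration create/delete pattern described above can be sketched as follows. This is an illustrative skeleton only, not the full solution: the real wget call is commented out so the sketch runs without network access, and "sample_hosts" is a throwaway stand-in for the assignment's "hosts" file.

```shell
#!/bin/bash
# Skeleton of the loop: read one host per line, create work-dir fresh,
# (fetch would happen here), then delete work-dir before the next host.

printf '%s\n' www.duckduckgo.com www.eff.org > sample_hosts

while read -r host; do
    echo "Doing $host"
    mkdir -v work-dir        # created each iteration
    # time wget -H -q -P work-dir -e robots=off -p -r -l 1 \
    #     --keep-session-cookies --save-cookies "$host.cookies" "https://$host"
    rm -rf work-dir          # deleted at the end of each iteration
done < sample_hosts

rm -f sample_hosts
```

Wrapping the (commented-out) wget in `time` is what produces the real/user/sys lines shown in the expected output, since `time` is a bash keyword that reports the timing of the command it prefixes.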
Answered Same Day, Jul 07, 2021

Answer To: Writing a complete bash script

Mani answered on Jul 08 2021
155 Votes
#!/bin/bash

get_final_result() {
    echo "Doing $1 with command : wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies $1.cookies https://$1"
    mkdir -v work-dir
    time wget -H -q -P work-dir -e robots=off -p -r -l 1 --keep-session-cookies --save-cookies "$1.cookies" "https://$1"
    rm -rf work-dir
    echo ""
}

# Loop over every host listed in the "hosts" file.
while read -r host; do
    get_final_result "$host"
done < hosts