When a client asks a web server to "serve" a given page, the server generates an HTTP response code indicating the status of the request. Broadly, HTTP status codes fall into five classes, with the first digit indicating the response type:
| Status Code | Class Descriptor |
|---|---|
| 1xx | Informational |
| 2xx | Success |
| 3xx | Redirection |
| 4xx | Client Errors |
| 5xx | Server Errors |
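To make those classes concrete, here's a minimal, purely illustrative bash sketch (the `classify_status` helper is hypothetical, not part of the script below) that maps a code's first digit to its class descriptor:

```bash
#!/usr/bin/env bash
# Illustrative only: classify an HTTP status code by its first digit.
classify_status(){
    case "${1:0:1}" in
        1) echo "Informational" ;;
        2) echo "Success" ;;
        3) echo "Redirection" ;;
        4) echo "Client Error" ;;
        5) echo "Server Error" ;;
        *) echo "Unknown" ;;
    esac
}

classify_status 503   # prints "Server Error"
```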
With this information in hand, here's a quick--though imperfect--way to check a site's availability by parsing HTTP response codes:
```bash
#!/usr/bin/env bash

url="https://geekberg.info"

url_check(){
    local status_code
    status_code=$(curl --output /dev/null --silent --head --write-out '%{http_code}\n' "$url")

    if [ "$status_code" -ne 200 ]; then
        printf "%s\n" "BAD URL"
    else
        printf "%s\n" "GOOD URL"
    fi
}

url_check
```
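Before unpacking the function, a quick usage sketch: `url_check` reads the global `url` variable, so, assuming the function above is already defined, you could (hypothetically) reassign `url` in a loop to check several sites in one pass:

```bash
# Hypothetical extension: url_check reads the global "url" variable,
# so reassigning it in a loop checks each site in turn.
for url in "https://geekberg.info" "https://example.com"; do
    printf "%s: " "$url"
    url_check
done
```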
In this function, we first create a local variable, `status_code`, asking curl to direct its output to /dev/null (the null device, which discards whatever is written to it but preserves the "write" status of the operation). We also ask curl to suppress its progress meter with the `--silent` switch, and to request only a page's headers with the `--head` option.
From there, curl writes a string containing the numeric response code (`%{http_code}`) for the requested URL to standard output, via `--write-out`. We then examine `status_code` with an `if..then` conditional: if the status code does NOT equal 200, there's likely an issue with our request.
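One blunt edge of this check is worth noting: a perfectly healthy redirect (say, a 301) also fails the `-ne 200` test. If you wanted curl to chase redirects and accept any 2xx result, one possible variant might look like this (a sketch based on assumptions about your needs, not part of the original script):

```bash
# Hypothetical variant: follow redirects (--location) and treat any
# 2xx code as success instead of insisting on exactly 200.
url_check(){
    local status_code
    status_code=$(curl --output /dev/null --silent --head --location \
        --write-out '%{http_code}' "$url")

    case "$status_code" in
        2??) printf "%s\n" "GOOD URL" ;;
        *)   printf "%s\n" "BAD URL" ;;
    esac
}
```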
As stated earlier, our approach isn't bulletproof, but it sure beats `ping`.
Cheers.