Shell script to log in to a website

Hello everyone,

I need to download a file from a website, but first I need to log in to it.
Is there a way to achieve that, e.g. with fetch?
I know it is possible with curl or wget, but I have not tried those yet; I would prefer to stick with fetch, since it comes out of the box.
The website in question is https://porkbun.com/account/login

Thank you in advance.
 
OK, thank you. Too bad that fetch cannot do it. Does anyone have experience with doing this in curl or wget?
I quickly tried curl, without success.

Code:
# cd /tmp/curl_test
# curl -c cookie.txt -d "loginUsername=***" -d "loginPassword=***" https://porkbun.com/account/login
# curl -b cookie.txt https://porkbun.com/account/ssl/my.domain?download=1
# curl: No match.

Of course, keep in mind that it is not a direct link to the file, like https://domain.tld/file.zip.
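By the way, the "curl: No match." line is most likely not from curl at all but from the shell: in csh/tcsh an unquoted ? in a URL is treated as a filename glob, and when nothing matches you get "No match". Quoting the URLs should get past that part. A minimal retry of the same commands (same form fields as above, which may not be everything the login form actually expects; the -o file name is just an example):

Code:
# cd /tmp/curl_test
# curl -L -c cookie.txt -d "loginUsername=***" -d "loginPassword=***" "https://porkbun.com/account/login"
# curl -L -b cookie.txt -o download.out "https://porkbun.com/account/ssl/my.domain?download=1"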
 
I'll have a look tomorrow; at work I have a Jenkins job running a shell script that fetches the latest Oracle Java JDK (Linux) once a week. The script has to jump through a couple of hoops (accept the EULA, accept cookies, sort through an HTML page and whatnot) in order to get to the download link. It's usually not that difficult. Finding the right variables to send back, to the right URL, in the correct order, can be a bit tricky though.
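Applied to the porkbun login, that pattern would look something like the sketch below: fetch the login page first (saving any cookies), check it for hidden form fields that have to be sent back, then post those together with the credentials and reuse the cookie jar for the actual download. The only field names taken from the post above are loginUsername and loginPassword; everything else (hidden fields, output file names) is an assumption that has to be checked against the real page.

Code:
#!/bin/sh
# Sketch only -- the exact fields porkbun expects must be read from the page.
login_url="https://porkbun.com/account/login"

# 1. Fetch the login form and keep whatever cookies it sets.
curl -s -L -c cookies.txt -o login.html "${login_url}"

# 2. Look for hidden <input> fields the form wants sent back.
grep -o '<input type="hidden"[^>]*>' login.html

# 3. Post the credentials (plus any hidden fields found in step 2).
curl -s -L -b cookies.txt -c cookies.txt \
  -d "loginUsername=***" \
  -d "loginPassword=***" \
  "${login_url}"

# 4. With the session cookie in place, fetch the protected file.
curl -L -b cookies.txt -o certificate.zip \
  "https://porkbun.com/account/ssl/my.domain?download=1"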
 
This is my script; it's not the prettiest script in the world, but it works.
Code:
#!/bin/sh

jdk_version=${1:-8}

readonly oracle_url="https://www.oracle.com"
readonly proxy="http://internal-proxy:8080"

 

# Find the version-specific JDK download page, then extract the direct
# .rpm download link(s) from it.
get_list() {
  readonly jdk_download_page="${oracle_url}/technetwork/java/javase/downloads/index.html"
  download_url=$(curl -x ${proxy} -s ${jdk_download_page} | \
    egrep -o "\/technetwork\/java\/javase\/downloads\/jdk${jdk_version}-downloads-.+?\.html" | \
    head -1 | cut -d '"' -f 1
  )

  # Follow that page and pick out the direct otn-pub .rpm download URLs.
  files_to_download=$(curl -x ${proxy} -s "${oracle_url}${download_url}" | \
    egrep -o "http\:\/\/download.oracle\.com\/otn-pub\/java\/jdk\/[8-9](u[0-9]+|\+).*\/jdk-${jdk_version}.*(-|_)linux-(x64|x64_bin).rpm"
  )

  return
}

# Download every matched file, sending the Oracle license-acceptance cookie.
download_files() {

  [ ! -d downloads ] && mkdir downloads

  for download in ${files_to_download}; do
    cd downloads
    curl -4 -L -C - -O -x ${proxy} \
      -b "oraclelicense=accept-securebackup-cookie" \
      -k ${download}
    cd -
  done

  return
}
 
# Main
get_list
download_files
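For what it's worth, the script takes the JDK major version as its only argument and falls back to 8 when nothing is passed (the ${1:-8} at the top), so, assuming it is saved as get_jdk.sh, the weekly job can call it like this:

Code:
$ sh get_jdk.sh 11   # fetch JDK 11
$ sh get_jdk.sh      # no argument, defaults to JDK 8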
 