I have a wild bunch of bookmark backups, in formats ranging from JSON and HTML to SQLite.
I'd like to text-search through all the files and pipe all unique address links to a browser-importable (i.e. Firefox) text file. I have NO concern for tags, timestamps, or other metadata; only the URLs. Is something like
$ grep -hE 'http|www' file1 file2 > out.file
possible? Does grep work through places.sqlite files?
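For the SQLite case, from what I can tell grep's -a flag forces it to treat a binary file as text, and Firefox keeps URLs in the moz_places table, so maybe one of these would work (a rough sketch only; the filename and URL pattern are my guesses):

$ # query the database directly with the sqlite3 CLI
$ sqlite3 places.sqlite 'SELECT url FROM moz_places;' > out.file
$ # or brute-force it: -a treats the binary file as text,
$ # -o prints only the matched URL instead of the whole binary "line"
$ grep -aoE 'https?://[^"[:space:]]+' places.sqlite > out.file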
Another alternative, as advised previously on this forum, would be something of the kind:
$ sort -u file1 file2 > out.file
The problem here is that sort alone would not filter for lines containing "http" or "www", so the result would be a mess. Therefore, the solution might be a combination of grep and sort?
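Perhaps something along these lines, assuming all the backups sit under one directory (again only a sketch; the regex is my rough guess at what counts as a URL):

$ # -r recurses into the directory, -a treats binaries (sqlite) as text,
$ # -h drops filenames, -o keeps only the matched URL; sort -u then dedupes
$ grep -rahoE '(https?://|www\.)[^"<>[:space:]]+' . | sort -u > bookmarks.txt

Lines that start with www. would presumably still need an http:// prefix added afterwards, but at least the list would be unique and sorted.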