jirib writes
Hi,
it would be cool if there were a helper script like urlview that would act as a "filter" for search results (‘recollq -m "pattern"’) on the command line and display something like this:
UrlView 0.9: (5 matches) Press Q or Ctrl-C to Quit!
->  1 file:///home/jirib/doc/it/essential_system_administration_3rd_edition.pdf]
2 www.it-ebooks.info
3 file:///home/jirib/doc/it/essential_system_administration_3rd_edition.pdf
4 file:///home/jirib/doc/it/the_definitive_guide_to_mysql_5_3rd_edition.pdf]
5 file:///home/jirib/doc/it/the_definitive_guide_to_mysql_5_3rd_edition.pdf
…and open files via xdg-open.
But it would also be nice to have a key to view each entry’s metadata (and, as the output above shows, urlview obviously filters the input incorrectly). There’s also ‘extract_url’, but its curses interface is overkill imo.
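A minimal sketch of the numbered listing described above; the here-doc stands in for real ‘recollq -b -q "pattern"’ output, and the file names are made up:

```shell
#!/bin/sh
# Number each result URL read on stdin, one per line, the way the
# urlview-style display above shows them. The here-doc below is a
# stand-in for piping in `recollq -b -q "pattern"` output.
listing=$(
    n=0
    while read -r url; do
        n=$((n + 1))
        printf '%2d  %s\n' "$n" "$url"
    done <<'EOF'
file:///home/user/doc/a.pdf
file:///home/user/doc/b.pdf
EOF
)
echo "$listing"
```

From there, opening the chosen number via xdg-open is a small additional step.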
medoc writes
Hi,
What’s wrong with typing "recoll -q some request" and then using the GUI?
jirib writes
There’s nothing wrong with it, but some people are console junkies :)
medoc writes
Ok, maybe a variation on:
----
#!/bin/sh

NRES=20

# Work file for the dialog selection; fall back if tempfile(1) is absent.
tempfile=$(tempfile 2>/dev/null) || tempfile=/tmp/test$$
trap "clear; rm -f $tempfile" 0 1 2 3 9 15

# Fetch the first $NRES result URLs, percent-escaping spaces so each
# URL stays a single word in the dialog argument list.
urls=$(recoll -t -n $NRES -b -q "$@" 2>/dev/null | \
    while read line; do
        echo "$line" | sed -e 's/ /%20/g'
    done)

# $urls is intentionally unquoted so it splits into one menu item per URL.
# dialog writes the chosen entry to stderr, hence the 2> redirection.
dialog --no-items --menu "Hello" 24 80 20 $urls 2> $tempfile || exit 1
xdg-open "$(cat $tempfile)"
----
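The sed substitution in the script is what keeps URLs containing spaces intact as single dialog arguments; a standalone illustration on a made-up path:

```shell
#!/bin/sh
# Percent-escape spaces in a file URL so it survives shell word
# splitting (the same trick the script above uses). The path is
# invented for the example.
url="file:///home/user/my docs/report.pdf"
escaped=$(printf '%s' "$url" | sed -e 's/ /%20/g')
echo "$escaped"
```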
medoc writes
Also, I forgot about this but the recoll webui works fine with the links terminal web browser. See the news paragraph from 2014-02-27 on the recoll.org front page (search for mutt).
medoc writes
Not one but two existing solutions :) !