named pipes question

Hi to all,

Could you please tell me whether there is any way to use FIFO files in full-duplex mode (reading and writing at the same time)?

My problem is that I want to feed two outputs into one FIFO file at the same time: a huge log file followed via tail -f, and the output of another program. Both should be appended to the same FIFO, which is continuously processed in real time by a parser I have written in awk. In operation, I lose many log lines, apparently due to the FIFO's one-way locking.


Thank you.
 
Use a local socket instead. You can't write to it with command-line redirection, but a simple C program could perform the necessary fd replacements.
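One way to sketch that socket idea from the shell is with socat standing in for the "simple C program" (socat itself, the socket path, and the merged.out file are my assumptions, not from the thread). With the fork option, each writer gets its own connection, so one writer closing does not end the stream for the others:

```shell
#!/bin/sh
# Sketch: merge several writers over a local (Unix-domain) socket.
# socat, the socket path, and merged.out are assumptions.
command -v socat >/dev/null 2>&1 || { echo "socat not installed"; exit 0; }

sock=${TMPDIR:-/tmp}/merge.$$.sock

# Reader side: 'fork' accepts any number of writers, each on its own
# connection, so one writer closing never ends the stream.
socat -u UNIX-LISTEN:"$sock",fork STDOUT > merged.out &
reader=$!
sleep 1   # crude wait for the listener; a real script would poll

# Two independent writers (in the thread: tail -F and the other program):
date | socat -u STDIN UNIX-CONNECT:"$sock"
echo "writer two" | socat -u STDIN UNIX-CONNECT:"$sock"

sleep 1
kill "$reader" 2>/dev/null
rm -f "$sock"
```

In practice the reader's stdout would be piped straight into the awk parser instead of merged.out.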

Your second paragraph seems inconsistent with the first. Do you keep the pipe open, or do you open it with each read/write operation? Maybe you need something like this:
Code:
{ tail -f ... &
  awk ... & } > named-pipe
wait
Kevin Barry
 
Thanks Kevin,

See below some clarification about my design:

There are two background processes that continuously write their stdout to the same FIFO file. This FIFO is processed in real time by the awk parser:

Code:
{ tail -F -n 1 logfile; } > /var/tmp/fifofile &
{ <other program that sends to stdout>; } > /var/tmp/fifofile &

cat /var/tmp/fifofile | awk '{ <my parser code> }' &

fyi, I use bash shell.
 
Do you even need a fifo? Unless you need to open it more than once, you could probably do this:
Code:
{ tail -F -n 1 logfile &
  <other program that sends to stdout> & } | awk '{ <my parser code> }' &
I still don't understand why you need duplexing.
Kevin Barry
 
It will never work. You cannot write to a named pipe from two processes at the same time; one will only start once the other stops. Also, when one end of the pipe closes, so does the other end, and from the shell you cannot control this.

Named pipes have very limited capabilities.
 
bigearsbilly said:
It will never work. You cannot write to a named pipe from two processes at the same time; one will only start once the other stops. Also, when one end of the pipe closes, so does the other end, and from the shell you cannot control this.

Named pipes have very limited capabilities.
Code:
mkfifo input
ls / > input &
date > input &
cat input
rm input
This works for me on both Linux and on FreeBSD.
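A note on why this is reliable for short lines (my addition, not stated above): POSIX guarantees that a write of at most PIPE_BUF bytes to a pipe or FIFO is atomic, so whole-line writes from two processes cannot interleave mid-line. A sketch that checks this with two concurrent writers (the FIFO path and the line counts are made up for the demonstration):

```shell
#!/bin/sh
# Two writers race on one FIFO; each echo is one short write(2),
# which POSIX makes atomic for sizes up to PIPE_BUF.
fifo=${TMPDIR:-/tmp}/atomic-demo.$$
mkfifo "$fifo"
exec 3<>"$fifo"   # hold the FIFO open read-write so nothing blocks on open

for w in A B; do
  ( i=1
    while [ "$i" -le 100 ]; do
      echo "writer-$w line $i"   # one short write per line
      i=$((i + 1))
    done ) >&3 &
done

# 200 lines in; count how many distinct, intact lines come out:
lines=$(head -n 200 <&3 | sort -u | wc -l)
echo "$lines"
wait
exec 3>&-
rm -f "$fifo"
```

If the writes could interleave mid-line, some lines would be corrupted and the count of distinct intact lines would fall short of 200.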
Kevin Barry
 
ta0kira said:
Do you even need a fifo? Unless you need to open it more than once, you could probably do this:
Code:
{ tail -F -n 1 logfile &
  <other program that sends to stdout> & } | awk '{ <my parser code> }' &
I still don't understand why you need duplexing.
Kevin Barry

My intention is to keep awk's main loop active so that the parser can do its processing periodically, regardless of whether there is data in the log file or not. That is what the second program that pipes its data into the same FIFO is for.
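That design can be sketched without a FIFO, along the lines of Kevin's pipeline: a heartbeat feeder keeps awk's main loop ticking even when the log is idle. Everything below (the HEARTBEAT marker, the three-tick limit, and the stand-in producer) is invented for illustration; a real run would use tail -F on the log and no tick limit:

```shell
#!/bin/sh
# Sketch: merge log lines with a periodic heartbeat so the parser
# wakes up on a schedule even when the log is quiet.
{
  echo "log line 1"          # stand-in for: tail -F -n 1 logfile &
  n=0
  while [ "$n" -lt 3 ]; do   # bounded here so the sketch terminates
    echo "HEARTBEAT"
    sleep 1
    n=$((n + 1))
  done
} | awk '
  /^HEARTBEAT$/ { ticks++; print "periodic pass", ticks; next }
  { print "parsed:", $0 }    # stand-in for the real parser code
'
```

The awk rule for the HEARTBEAT marker is where the periodic work would go; everything else falls through to the normal line-parsing rules.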
 