SUMMARY: cropping a file

From: Rich Glazier (rglazier2002@yahoo.com)
Date: Fri Feb 13 2004 - 14:43:27 EST


I thought there was a way using /dev/null, but I think
I was mistaken. Here are some of the suggestions.
Thanks to everyone.
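(For the record, the /dev/null approach only empties a
file completely rather than trimming lines from it:

cat /dev/null > logfile

so it can't remove just the first 100 lines.)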

(Chris Ruhnke, Bob Marcan, Alan Rollow, Steve Feehan,
Elin Vaeth, Jim Fitzmaurice, Paul A. Sand, James
Sainbury, Phillip Brown, John Farmer)

I ended up creating a cropped temp file, and then
copying it over the original:

tail -49 /usr/scripts/prod_on_mars.out > /tmp/newfile
rm /usr/scripts/prod_on_mars.out
mv /tmp/newfile /usr/scripts/prod_on_mars.out
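A near-equivalent sketch that avoids the fixed temp
name and the separate rm (it assumes mktemp is
available on the system):

TMPFILE=`mktemp /tmp/prod_on_mars.XXXXXX` || exit 1
tail -49 /usr/scripts/prod_on_mars.out > $TMPFILE
mv $TMPFILE /usr/scripts/prod_on_mars.out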

----------

Hi,

Here is a solution. Let me know if you find a more
elegant way to do
this:

Put the following into a file:

ed filename.ext << _HERE_
1,100d
w
_HERE_

The above translates as: use the 'ed' editor to edit
filename.ext, feeding it commands until the string
_HERE_ is encountered.

1,100d   delete lines 1 through 100
w        write the buffer contents back to the file
_HERE_   end of the here-document; ed exits
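The same edit can also be issued as a one-liner by
piping the commands into ed; a sketch (the -s flag
just suppresses ed's byte-count output):

printf '1,100d\nw\nq\n' | ed -s filename.ext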

Although this will execute very quickly (assuming the
logfile isn't huge), there is a small window of
opportunity for the application that owns the file to
experience a synchronization problem.

Consider: the logfile is being modified by two
sources, the owner of the file and your program. When
the 'ed' command is executed, the file contents are
read into a buffer but the file is not locked. There
is potential for the source file contents to change
while ed is deleting the first 100 lines. At the end
of the deletion, ed writes its copy of the contents
back to the logfile, and that copy will not include
any updates that took place between the time ed read
the file and the time it executed the 'w' command.

Let me know if this works for you.

John Farmer

-----------------------

Try this:

  # sed -n 1,100p filename
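If the goal is to discard rather than display the
first 100 lines, the complementary expression deletes
them; classic sed has no in-place option, so this
sketch still goes through a temporary file (the name
is a placeholder):

  # sed '1,100d' filename > /tmp/filename.new
  # mv /tmp/filename.new filename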

Regards,

Phillip S. Brown

-----------------------

I'm not sure what you mean by "dump", but I think this
truncates the
file at 100 lines:

    perl -pi -e 'exit if ($. > 100)' file

I think this creates a temporary file, but the details
are at least hidden from you.
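For the opposite effect, dropping the first 100 lines
and keeping the rest, the inverse test should work (a
sketch along the same lines, with the same hidden
temp file):

    perl -ni -e 'print if $. > 100' file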

 
 Paul A. Sand

----------------------

Just off the top of my head, I'd use wc -l to get a
line count, subtract 100 from that number, then use
tail -<number> and replace the file with the result.
Something like:

#!/bin/sh
# LOGFILE is assumed to already point at the log being
# trimmed; NEWFILE was not defined in the original
# message, so a temp-file name is assumed here.
NEWFILE=/tmp/logcrop.$$

LINECOUNT=`wc -l < $LOGFILE`
NEWCOUNT=`expr $LINECOUNT - 100`

tail -$NEWCOUNT $LOGFILE > $NEWFILE

rm $LOGFILE
mv $NEWFILE $LOGFILE

exit 0

Jim Fitzmaurice

-----------------------

Perhaps the 'split' command will give you what you
want.
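A sketch of how that might look: split the log into
100-line pieces, drop the first piece, and put the
rest back together (the /tmp prefix is a placeholder):

split -l 100 logfile /tmp/logpart.
rm /tmp/logpart.aa
cat /tmp/logpart.* > logfile
rm /tmp/logpart.*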

Regards,

Elin Vaeth

--- Rich Glazier <rglazier2002@yahoo.com> wrote:
> Does anyone have a suggestion about how to crop a
> file such that it always remains a certain size (in
> one step, i.e. not creating and deleting files and
> not an editor)? I have a log file where I want to
> dump the first 100 lines once a week. I've tried
> ways of using head and redirecting /dev/null to the
> file, but can't quite get what I want.
>
> Thanks.
>
> T645.1A PK6



