
I want to copy the contents of five files into one file as is. I tried doing it with cp for each file, but each copy overwrites what the previous one wrote. I also tried

paste -d "\n" 1.txt 0.txt

and it did not work.

I want my script to add a newline at the end of each text file.

e.g. files 1.txt, 2.txt, 3.txt. Put the contents of 1, 2, 3 into 0.txt.

How do I do it?


12 Answers


You want the cat (short for "concatenate") command, with shell redirection (>) into your output file:

cat 1.txt 2.txt 3.txt > 0.txt
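Since the question also asks for a newline at the end of each file, here is a minimal sketch (the file names and contents below are made up) that emits one explicitly after every input:

```shell
# Hypothetical sample inputs that lack trailing newlines
printf 'one' > 1.txt
printf 'two' > 2.txt
printf 'three' > 3.txt

# cat each file, then emit an explicit newline so the next
# file's contents can never run onto the same line
for f in 1.txt 2.txt 3.txt; do
  cat "$f"
  echo
done > 0.txt
```

With plain cat, "one" and "two" would land on the same line here; the extra echo guarantees a separator.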
  • shouldn't it be >> right? And also, is there already a newline before the text that gets inserted into the 0.txt file?
    – blasto
    Aug 2, 2013 at 0:00
  • did you want to preserve the contents of 0.txt?
    – sehe
    Aug 2, 2013 at 0:03
  • @blasto it depends. You would use >> to append one file onto another, where > overwrites the output file with whatever's directed into it. As for the newline, is there a newline as the first character in file 1.txt? You can find out by using od -c, and seeing if the first character is a \n.
    – radical7
    Aug 2, 2013 at 0:04
  • @blasto You're definitely heading in the right direction. Bash certainly accepts the {...} form for filename matching, so perhaps the quotes fouled things up a bit in your script? I always try working out things like this using ls in a shell. When I get it right, I just cut-n-paste it into a script as is. You may also find the -x option useful in your scripting - it will echo the expanded commands in the script before execution.
    – radical7
    Aug 2, 2013 at 15:30
  • To maybe stop somebody from making the same mistake: cat 1.txt 2.txt > 1.txt will just override 1.txt with the content of 2.txt. It does not merge the two files into the first one. Apr 27, 2017 at 21:31

One option, for those of you who still stumble upon this post as I have, is to use find -exec:

find . -type f -name '*.txt' -exec cat {} + >> output.file

In my case, I needed a more robust option that would look through multiple subdirectories, so I chose to use find. Breaking it down:

find .

Look within the current working directory.

-type f

Only interested in files, not directories, etc.

-name '*.txt'

Whittle down the result set by name.

-exec cat {} +

Execute the cat command for each result. "+" means only one instance of cat is spawned (thx @gniourf_gniourf).

 >> output.file

As explained in other answers, append the cat-ed contents to the end of the output file.
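A variant of the same command (the demo directory and file names below are made up) that quotes the pattern and excludes the output file from the search, avoiding the self-append problem discussed in the comments:

```shell
# Hypothetical demo tree with files in nested subdirectories
mkdir -p demo/sub
printf 'a\n' > demo/one.txt
printf 'b\n' > demo/sub/two.txt

# Quote the pattern, and exclude the output file by name so the
# search can never feed output.txt back into itself
find demo -type f -name '*.txt' ! -name 'output.txt' -exec cat {} + >> demo/output.txt
```

Without the ! -name exclusion, a second run would pick up demo/output.txt itself and append its own contents.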

  • There are lots of flaws in this answer. First, the wildcard *.txt must be quoted (otherwise, the whole find command, as written, is useless). Another flaw comes from a gross misconception: the command that is run is not cat >> 0.txt {}, but cat {}. Your command is in fact equivalent to { find . -type f -name *.txt -exec cat '{}' \; ; } >> 0.txt (I added grouping so that you realize what's really happening). Another flaw is that find is going to find the file 0.txt, and cat will complain by saying that the input file is the output file. Nov 4, 2014 at 16:25
  • Thanks for the corrections. My case was a little bit different and I hadn't thought of some of those gotchas as applied to this case.
    – mopo922
    Nov 4, 2014 at 16:28
  • You should put >> output.file at the end of your command, so that you don't mislead anybody (including yourself) into thinking that find will execute cat {} >> output.file for every found file. Nov 4, 2014 at 16:39
  • Starting to look really good! One final suggestion: use -exec cat {} + instead of -exec cat {} \;, so that only one instance of cat is spawned with multiple arguments (+ is specified by POSIX). Nov 4, 2014 at 16:55
  • Good answer and word of warning - I modified mine to: find . -type f -exec cat {} + >> outputfile.txt and couldn't figure out why my output file wouldn't stop growing into the gigs even though the directory was only 50 megs. It was because I kept appending outputfile.txt to itself! So just make sure to name that file accurately or place it in another directory entirely to avoid this. Jan 10, 2017 at 22:56

If your files all share a certain type, then do something like this:

cat /path/to/files/*.txt >> finalout.txt
  • Keep in mind that you are losing the possibility to maintain merge order though. This may affect you if you have your files named, e.g. file_1, file_2, … file_11, because of the natural order in which files are sorted.
    – Mike Doke
    Nov 19, 2019 at 13:42
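The ordering caveat above can be worked around with a version sort; a sketch assuming sort supports the -V flag (GNU and recent BSD sort do) and that filenames contain no spaces or newlines:

```shell
# Hypothetical numbered files whose lexical order differs
# from their numeric order
mkdir -p vdemo
for n in 1 2 11; do printf 'file_%s\n' "$n" > "vdemo/file_$n.txt"; done

# A plain glob sorts lexically (file_1, file_11, file_2);
# sort -V restores numeric order before concatenating.
printf '%s\n' vdemo/file_*.txt | sort -V | xargs cat > vdemo/merged.txt
```
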

If all your files are named similarly, you could simply do:

cat *.log >> output.log

If all your files are in a single directory you can simply do:

cat * > 0.txt

Files 1.txt, 2.txt, … will go into 0.txt.

  • Already answered by Eswar. Keep in mind that you are losing the possibility to maintain merge order though. This may affect you if you have your files named, e.g. file_1, file_2, … file_11, because of the natural order in which files are sorted.
    – Mike Doke
    Nov 19, 2019 at 13:46
for i in {1..3}; do cat "$i.txt" >> 0.txt; done

I found this page because I needed to join 952 files together into one. I found this to work much better if you have many files. This will do a loop for however many numbers you need and cat each one using >> to append onto the end of 0.txt.

Edit:

as brought up in the comments:

cat {1..3}.txt >> 0.txt

alternatively

cat {0..3}.txt >> all.txt
  • you could use brace expansion in bash to write cat {1,2,3}.txt >> 0.txt
    – mcheema
    Oct 22, 2017 at 12:23

Another option is sed:

sed r 1.txt 2.txt 3.txt > merge.txt 

Or...

sed h 1.txt 2.txt 3.txt > merge.txt 

Or...

sed -n p 1.txt 2.txt 3.txt > merge.txt # -n is mandatory here

Or without redirections ...

sed wmerge.txt 1.txt 2.txt 3.txt

Note that the last line also writes merge.txt (not wmerge.txt!). You can use w"merge.txt" to avoid confusion with the file name, and -n for silent output.

Of course, you can also shorten the file list with wildcards. For instance, in case of numbered files as in the above examples, you can specify the range with braces in this way:

sed -n w"merge.txt" {1..3}.txt

If your files contain headers and you want to remove them in the output file, you can use:

for f in `ls *.txt`; do sed '2,$!d' $f >> 0.out; done
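An equivalent sketch using tail -n +2 and a plain glob instead of parsing ls output (the demo file names below are made up):

```shell
# Hypothetical CSV-style files that each start with a header line
mkdir -p hdemo
printf 'header\nrow1\n' > hdemo/a.txt
printf 'header\nrow2\n' > hdemo/b.txt

# tail -n +2 prints each file from its second line onward,
# i.e. everything except the one-line header
for f in hdemo/*.txt; do
  tail -n +2 "$f" >> hdemo/0.out
done
```

Iterating over the glob directly, rather than over `ls` output, keeps filenames with spaces intact.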

All of the (text-) files into one:

find . | xargs cat > outfile

xargs makes the output lines of find . the arguments of cat.

find has many options, like -name '*.txt' or -type.

You should check them out if you want to use it in your pipeline.

  • You should explain what your command does. Btw, you should use find with -print0 and xargs with -0 in order to avoid some caveats with special filenames. Nov 12, 2020 at 14:44
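Following the comment's advice, a null-delimited sketch that survives a filename containing a space (the demo names below are made up):

```shell
# Hypothetical files, one with a space in its name
mkdir -p xdemo
printf 'hello\n' > 'xdemo/with space.txt'
printf 'world\n' > xdemo/plain.txt

# -print0 and -0 pass names separated by NUL bytes, so the
# space in "with space.txt" cannot split the argument
find xdemo -type f -name '*.txt' -print0 | xargs -0 cat > outfile
```

With plain find | xargs, "with space.txt" would be split into two nonexistent filenames.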

Send multiple files to one file (textall.txt):

cat *.txt > textall.txt

If the original file contains non-printable characters, they will be lost when using the cat command. Using 'cat -v', the non-printables will be converted to visible character strings, but the output file would still not contain the actual non-printable characters of the original file. With a small number of files, an alternative might be to open the first file in an editor (e.g. vim) that handles non-printing characters. Then maneuver to the bottom of the file and enter ":r second_file_name". That will pull in the second file, including non-printing characters. The same could be done for additional files. When all files have been read in, enter ":w". The end result is that the first file will now contain what it did originally, plus the content of the files that were read in.
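The editor workflow above can be scripted with ex, vim's line-editor mode; a sketch assuming ex is installed and the files are small (the file names below are made up):

```shell
# Two hypothetical input files
printf 'first\n' > a.txt
printf 'second\n' > b.txt

# $r reads the named file in after the last line of the buffer;
# w writes the combined result back to a.txt
ex -s a.txt <<'EOF'
$r b.txt
w
q
EOF
```

This is the same ":r" sequence the answer describes, just driven from a here-document instead of interactively.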

  • This isn't very scriptable. Jul 15, 2019 at 18:06

If you want to append the contents of 3 files into one file, then the following command will be a good choice:

cat file1 file2 file3 | tee -a file4 > /dev/null

This will combine the contents of all three files into file4, discarding tee's standard output by sending it to /dev/null.
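For what tee is actually designed for, writing several destinations at once, a short sketch (the file names below are made up):

```shell
# Hypothetical inputs
printf '1\n' > f1.txt
printf '2\n' > f2.txt

# tee -a appends its stdin to every file named as an argument;
# redirecting stdout to /dev/null keeps the terminal quiet
cat f1.txt f2.txt | tee -a copyA.txt copyB.txt > /dev/null
```

With a single output file, plain >> is simpler; tee earns its keep when you need the combined stream in more than one place.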

  • Why tee? And what is -a for? It would improve your answer if you used the long versions of the switches.
    – buhtz
    Oct 13, 2022 at 13:02
