I was in a bit of a bind on this customer’s task: there were about 1,500 data files on SERVER1 that had to be transferred to a specific network share on SERVER2, because the clients were running an app that imports these files and turns them into working tables. They also wanted me to create a scheduled task that runs every day and, once it’s done copying the files, produces a list of which files were copied, on what date, and at what time.
Sure, it sounds simple, and I could complicate the script a bit more, but keeping it simple for the customer was probably the best (safest) way to go.
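The scheduled task itself isn’t the interesting part, but for completeness, this is roughly how it can be registered with the ScheduledTasks cmdlets. Treat it as a sketch: the script path (C:\Scripts\Copy-ABCFiles.ps1), task name, run time, account and password are all placeholders for whatever fits your environment.

###### Sketch: register the daily task (path, account and password are placeholders) ######
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Copy-ABCFiles.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At "06:00"
Register-ScheduledTask -TaskName "ABC File Transfer" -Action $action -Trigger $trigger -User "DOMAIN\svc_filecopy" -Password "PlaceholderPassword" -RunLevel Highest

The account only needs read access on the SERVER1 share and write access on the SERVER2 share.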
SCRIPT
###### Author: caffeinatedadmin.home.blog ######
###### Discover all .dat files with "ABC" anywhere in the name, modified within the last day ######
$file = Get-ChildItem "\\SERVER1\ProcessedFiles\*ABC*.dat" -File | Where-Object {($_.LastWriteTime -gt (Get-Date).AddDays(-1))}
$destination = "\\SERVER2\Share$\ABC"
$logcontent = $file | Select-Object Name, LastWriteTime
###### Copy item from discovery above to the destination path ######
Copy-Item -Path $file.FullName -Destination $destination -Force -Recurse -Confirm:$false
###### Log file ######
$logpath = Test-Path "$destination\TransferLogs"
if ($logpath -eq $true) {
    $logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--hh-mm).txt" -Append -Encoding ASCII
}
else {
    New-Item "$destination\TransferLogs" -ItemType Directory | Out-Null
    $logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--hh-mm).txt" -Append -Encoding ASCII
}
BREAKDOWN
STEP 1
The first part of the script is self-explanatory if you’re not new to this.
$file = Get-ChildItem "\\SERVER1\ProcessedFiles\*ABC*.dat" -File | Where-Object {($_.LastWriteTime -gt (Get-Date).AddDays(-1))}
The variable $file collects all files (ignoring directories) in the given path that have the word “ABC” in the name and the “.dat” extension. The piped Where-Object query filters this further on the LastWriteTime attribute, which is basically when that particular file was last modified by the initial export. Here I told the query to take today’s date, go back one day, and keep only files modified or created since yesterday. This was my basis. And now, if you run this on your computer, you get this:

The variable $destination, of course, simply holds the path the files will be copied to in the next step.
The variable $logcontent is there for my third step, the creation of the log files. As you can see, it simply selects the information from $file that I want as the content of the log file: the name and the last write time. If we run it on our dummy folder:

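If you want to reproduce that preview on your own machine, this is all it takes; point it at any test folder with a few dummy .dat files (C:\Temp\Dummy here is just a made-up path):

###### Quick local test of the discovery filter and log content (test path is a placeholder) ######
$testfile = Get-ChildItem "C:\Temp\Dummy\*ABC*.dat" -File | Where-Object {($_.LastWriteTime -gt (Get-Date).AddDays(-1))}
$testfile | Select-Object Name, LastWriteTime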
STEP 2
We’re ready to transfer our files now, and you can’t go wrong with Copy-Item:
Copy-Item -Path $file.FullName -Destination $destination -Force -Recurse -Confirm:$false
If you simply use the -Confirm parameter, you will be prompted to confirm your action.

So we’ll disable it by setting the parameter to false: -Confirm:$false
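While you’re still testing, a safer middle ground is -WhatIf: run the same line with it once and Copy-Item will only report what it would copy, without touching anything. Drop it again for the unattended run.

###### Dry run: reports what would be copied without actually copying anything ######
Copy-Item -Path $file.FullName -Destination $destination -Force -Recurse -WhatIf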
STEP 3
Now we’ve reached the notorious creation of custom log files. In truth I could have packed them with loads of information, but clients get confused very easily, as we all know. So I wanted to simply have the $logcontent variable written out to a .txt file.
Then I decided that the date and time of the transfer would be part of each log file’s name.
One obstacle I ran into was the thought that the client will probably delete my log folder at some point, so I had to make sure it is always there. Which means the script is not only going to create a log, it will also check for the existence of the log directory, which I called “TransferLogs”.
$logpath = Test-Path "$destination\TransferLogs"
if ($logpath -eq $true) {
    $logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--hh-mm).txt" -Append -Encoding ASCII
}
else {
    New-Item "$destination\TransferLogs" -ItemType Directory | Out-Null
    $logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--hh-mm).txt" -Append -Encoding ASCII
}
I did that with the ever-so-lovely “if” statement.
- The $logpath variable checks (with Test-Path) whether “$destination\TransferLogs” exists.
- If $logpath is true (the folder exists), go straight to creating the log; if it isn’t, run the else block.
- The else block first creates the folder and then carries on and creates the log file (there’s a shorter equivalent in the sketch just below).
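For what it’s worth, the same “make sure the folder is there” guarantee fits in two lines, because New-Item -Force simply returns the existing directory instead of throwing an error when the folder is already present. I kept the if/else because it reads more clearly to whoever inherits the script, but this sketch is equivalent:

###### Equivalent shortcut: -Force quietly handles the case where TransferLogs already exists ######
New-Item "$destination\TransferLogs" -ItemType Directory -Force | Out-Null
$logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--hh-mm).txt" -Append -Encoding ASCII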
You will notice that the actual log creation cmdlet Out-File has some funky stuff where the name of the log file should be:
Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--hh-mm).txt"
The “ABC_” is a literal part of the name, whereas the $( ) is a subexpression inside the double-quoted string: PowerShell runs the Get-Date command in the brackets and drops its result straight into the string, formatted as day-month-year--hour-minutes.
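If you want to see exactly what name that expression will produce before running the whole script, you can evaluate the string on its own (assuming $destination is already set as in the script above):

###### Just prints the log path that would be used at this moment ######
"$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--hh-mm).txt"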
Results
Our destination folder was completely empty at the beginning. And now we can see the content we wanted:

We can see that the Get-ChildItem/Copy-Item combination copied only the files with the word “ABC” in the name, and left the rest behind in the source folder:

If we take a look at the TransferLogs folder, we see our log for today, with the date, hour and minutes of the transfer right in the file name:

And the content of the log file shows the client exactly which files were transferred at that moment:



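One last tip: if you’d rather check the result from the console than from Explorer, a couple of throwaway lines will do it (again assuming $destination is set as in the script):

###### List the newest logs and print the content of the most recent one ######
Get-ChildItem "$destination\TransferLogs" -File | Sort-Object LastWriteTime -Descending | Select-Object -First 5 Name, LastWriteTime
Get-Content (Get-ChildItem "$destination\TransferLogs" -File | Sort-Object LastWriteTime -Descending | Select-Object -First 1).FullName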