
[PS] Copy files newer than “N” days & Create LOGS

I was in a bit of a bind on this customer’s task: there were about 1,500 data files on SERVER1 that had to be transferred to a specific network share on SERVER2, because the clients were running an app that imports these files and turns them into working tables. They also wanted me to create a Scheduled Task that runs every day, and once the copying was done, a list of which files were copied, on what date, and at what time.

Sure, it sounds simple, and I could have complicated the script a bit more, but keeping it simple for the customer was probably the best (safest) way to go.

SCRIPT

###### Author: caffeinatedadmin.home.blog ######

###### Discover all files that contain the word "ABC" in the name ######

$file = Get-ChildItem "\\SERVER1\ProcessedFiles\*ABC*.dat" -File | Where-Object {($_.LastWriteTime -gt (Get-Date).AddDays(-1))}
$destination = "\\SERVER2\Share$\ABC"
$logcontent = $file | Select-Object Name, LastWriteTime

###### Copy the discovered files to the destination path ######

Copy-Item -Path $file.FullName -Destination $destination -Force -Confirm:$false

###### Log file ######

$logpath = Test-Path "$destination\TransferLogs"

if ($logpath -eq $true){
    $logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--HH-mm).txt" -Append -Encoding ASCII
}
else {
    New-Item "$destination\TransferLogs" -ItemType Directory | Out-Null
    $logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--HH-mm).txt" -Append -Encoding ASCII
}

BREAKDOWN

STEP 1

The first part of the script is self-explanatory if you’re not new to this.

$file = Get-ChildItem "\\SERVER1\ProcessedFiles\*ABC*.dat" -File | Where-Object {($_.LastWriteTime -gt (Get-Date).AddDays(-1))}

The variable $file collects all files in the given path (ignoring directories) whose names contain the word “ABC” and whose extension is “.dat”. The piped Where-Object query filters this further on the LastWriteTime attribute, which basically means when the file was last modified by the initial export. Here I told the query to fetch today’s date and then go back one day, so the variable only picks up files modified or created since yesterday. That was my basis, and running it against a test folder returns exactly those files.
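If the day window ever needs to change (the “N” in the title), here is a minimal sketch of the same filter with the window pulled out into a variable ($daysBack and $cutoff are my own names, not part of the original script):

$daysBack = 1
$cutoff = (Get-Date).AddDays(-$daysBack)

# Same discovery as above, but the look-back window is now one edit away
$file = Get-ChildItem "\\SERVER1\ProcessedFiles\*ABC*.dat" -File | Where-Object { $_.LastWriteTime -gt $cutoff }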

The variable $destination, of course, simply holds the destination path the files will be copied to in the next step.

The variable $logcontent is there for my third step of creating the log files. As you can see, it simply selects the properties from $file that I want as the content of the log file: the file name and the last write time.
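If the customer ever wants more detail in the log, here is a sketch of an extended $logcontent with a size column (the SizeKB column is my suggestion, not part of the original script):

# Name and last write time as before, plus the file size rounded to whole KB
$logcontent = $file | Select-Object Name, LastWriteTime,
    @{label="SizeKB";expression={[math]::Round($_.Length / 1KB)}}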


STEP 2

We’re ready to transfer our files now, and you can’t go wrong with “Copy-Item”.

Copy-Item -Path $file.FullName -Destination $destination -Force -Confirm:$false

If you simply use the -Confirm parameter, you will be prompted to confirm your action.

So we’ll disable it by setting the parameter to false: -Confirm:$false
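Before wiring the copy into a scheduled task, it’s worth a dry run. -WhatIf is a standard common parameter on Copy-Item that prints what would be copied without touching anything; a quick sketch:

# Prints "What if: Performing the operation Copy File..." lines, copies nothing
Copy-Item -Path $file.FullName -Destination $destination -Force -WhatIf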

STEP 3

Now we’ve reached the notorious creation of custom log files. In truth, I could have packed them with loads of information, but clients get confused very easily, as we all know. So I simply wanted the contents of the $logcontent variable written to a .txt file.

Then I decided that the date and time of the transfer would be part of each log file’s name.

One obstacle I ran into was the thought that the client would probably delete my log folder at some point, so I had to make sure it was always there. That means the script not only creates a log, it also checks for the existence of the log directory, which I called “TransferLogs”.

$logpath = Test-Path "$destination\TransferLogs"

if ($logpath -eq $true){
    $logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--HH-mm).txt" -Append -Encoding ASCII
}
else {
    New-Item "$destination\TransferLogs" -ItemType Directory | Out-Null
    $logcontent | Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--HH-mm).txt" -Append -Encoding ASCII
}

I did that with the ever-so-lovely “if” statement.

  1. The log path is “$destination\TransferLogs”.
  2. If $logpath is true (the folder exists), go straight to creating the log; if it isn’t, fall into the else statement.
  3. The else statement first creates the folder and then runs the same logic to create the log file (a more compact equivalent is sketched after this list).
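For reference, the same logic can be collapsed into a “create the folder only if it’s missing” pattern; a minimal sketch, behaviour-equivalent to the if/else above:

$logdir = "$destination\TransferLogs"

# Create the log folder only when it does not exist yet
if (-not (Test-Path $logdir)) {
    New-Item $logdir -ItemType Directory | Out-Null
}

# Then the log can be written unconditionally
$logcontent | Out-File "$logdir\ABC_$(Get-Date -Format dd-MM-yyyy--HH-mm).txt" -Append -Encoding ASCII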

You will notice that the actual log creation cmdlet, Out-File, has some funky stuff where the name of the log file should be:

Out-File "$destination\TransferLogs\ABC_$(Get-Date -Format dd-MM-yyyy--HH-mm).txt"

The “ABC_” is a literal part of the name, whereas the $( ) is PowerShell’s subexpression operator: the Get-Date call inside the brackets runs first, fetches the current time, and formats it as day-month-year--hour-minutes, and the result is spliced into the file name.
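A quick way to test the format on its own, a sketch you can paste into any console:

# The subexpression runs Get-Date first, then splices the result into the string
"ABC_$(Get-Date -Format dd-MM-yyyy--HH-mm).txt"
# e.g. ABC_05-11-2019--14-32.txt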

Results

Our destination folder was completely empty at the beginning. After the run, it contains exactly the content we wanted.

We can also see that Copy-Item copied only the files with the word “ABC” in the name and left the rest in the origin folder.

If we take a look at the TransferLogs folder, we see today’s log, named with the date, hours, and minutes of the transfer.

And the content of the log file shows the client exactly what was transferred at that moment.
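Since the customer also wanted this running as a daily Scheduled Task, here is a minimal sketch using the ScheduledTasks module (Server 2012 / Windows 8 and later); the script path, task name, and 06:00 start time are my own placeholders:

# Run the copy script every day at 06:00 under powershell.exe
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\CopyABC.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'Copy ABC files' -Action $action -Trigger $trigger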

[PS] Remotely Check Installed Software Versions

Whichever solution you are using for your third party patching, it’s always good to have live data that you know you can rely on.

Last time I changed companies, I ran into a peculiar issue: my compliance was at 36%, which was unacceptable. After some digging, of course, I realized the damn environment had a gazillion machines with old versions, or two versions installed side by side, and so on. SCCM would patch the proper versions, but it would still return a weird error about the certificate not being recognized.

So let’s take Chrome patching as an example, seeing that Chrome gets a new version almost every month.

Instead of UNC-ing to every machine to check the chrome.exe file version in the Details property tab, I wrote a few lines that show me any exe that is not on the latest version.
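For a single machine, that boils down to reading the VersionInfo off the exe; a sketch, assuming Chrome’s default install path (PC01 is a placeholder, and the path differs between x86 and x64 installs):

# ProductVersion is the same value shown in the Details tab of the file's Properties
$chrome = "\\PC01\C$\Program Files (x86)\Google\Chrome\Application\chrome.exe"
(Get-Item $chrome -ErrorAction SilentlyContinue).VersionInfo.ProductVersion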

In the example below I’ve grouped the checks for Flash Player and Notepad++ that were failing on some devices.

$list = Get-Content C:\Temp\list.txt

$export = foreach ($C in $list) {

    # Known install locations to check on each machine
    $paths = "\\$C\C$\Windows\System32\Macromed\Flash\Flash.ocx",
             "\\$C\C$\Program Files\Notepad++\notepad++.exe",
             "\\$C\C$\Program Files (x86)\Notepad++\notepad++.exe",
             "\\$C\C$\Windows\System32\Macromed\Flash\FlashUtil*.exe"

    foreach ($path in $paths) {
        Get-ChildItem -File $path -ErrorAction SilentlyContinue |
            Select-Object DirectoryName,
                @{label="File Name";expression={$_.VersionInfo.OriginalFilename}},
                @{label="ProductVersion";expression={$_.VersionInfo.ProductVersion}}
    }
}

$export | Export-Csv C:\Temp\export.csv -NoTypeInformation

Of course, the above needs some snooping first: you have to know exactly which directory the software is installed in.
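If you don’t know the install directory up front, the registry uninstall keys are a decent place to snoop; a sketch to run locally on a sample machine (the Wow6432Node path covers 32-bit apps on 64-bit Windows):

# Filtering on DisplayName drops the keys that aren't real applications
Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
                 'HKLM:\SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*' -ErrorAction SilentlyContinue |
    Where-Object DisplayName |
    Select-Object DisplayName, DisplayVersion, InstallLocation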

The script exports the UNC path into a CSV file, showing the file name and the product version that is installed. This is basically the PowerShell equivalent of right-clicking an app, going to Properties, and then the Details tab.

Example:

$Path = "C:\Program Files (x86)\Adobe\Acrobat Reader DC\Reader\AcroRd32.exe"
 
    if (test-path $path){
 
        get-childitem -File $path -ErrorAction SilentlyContinue | Select DirectoryName, @{label="File Name";expression={$_.versioninfo.OriginalFilename}}, @{label="ProductVersion";expression={$_.versioninfo.productversion}}
         }
        else{
 
        Write-Host "Path is unreachable" -ForegroundColor Red
     
 
        }

[PS][DOS] Quick SCCM patching status

All you SCCM riders roll your eyes whenever it’s patching week. Not just because of worrying about what the compliance will look like, but also because it gets tedious to check which KBs got installed and which didn’t, seeing that there are at least 40 KBs downloaded for servers, and likewise for desktops.

Essentially, you’d remote into a server, check Software Center and see what failed, then you’d check the logs, etc.

I keep my servers pretty cleaned up, so I usually only get one or two servers with a failed installation, which is easy to fix. Logging into them and checking is fine, but I prefer the good ol’ cmd.exe and powershell.exe talking to me instead.

STEP 1

You’re obviously going to check what’s installed on the server and when. Seeing that PowerShell reads the InstalledOn timestamp in a funky way, which can result in the dates not showing properly, I prefer to run this command in cmd.exe:

wmic /node:'SERVER1' qfe where "InstalledOn like '10/%/2019'" GET description, hotfixid, installedon

The above checks a remote server for KBs installed during October 2019 only and will show you exactly which date each one was installed on. Let’s say our server has 4 updates installed. We’ll run the same for SERVER2, and now we know which KBs both have.
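If you’d rather stay in PowerShell, Get-HotFix exposes the same QFE data, and on most builds InstalledOn comes back as a real [datetime] you can filter on directly (on some older builds it’s a string, which is exactly the funkiness mentioned above, so verify on your estate first); a sketch:

# KBs installed on SERVER1 during October 2019
Get-HotFix -ComputerName SERVER1 |
    Where-Object { $_.InstalledOn -ge [datetime]'2019-10-01' -and $_.InstalledOn -lt [datetime]'2019-11-01' } |
    Select-Object Description, HotFixID, InstalledOn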

STEP 2

Sure, now you could filter your Security Updates group and sniff out the updates needed for SERVER1, but SCCM already did the scan and downloaded what SERVER1 needed into the ccmcache folder. And sure, you can try the UNC path remotely or actually log on to the server, but there’s a faster, cleaner way.

$list = Get-Content C:\Temp\list.txt

foreach ($PC in $list) {

    $path = "\\$PC\C$\Windows\ccmcache"

    if (Test-Path -Path $path) {
        Get-ChildItem -Path $path -File -Recurse |
            Select-Object Directory, PSChildName, LastWriteTime, Extension |
            Sort-Object LastWriteTime -Descending
    }
    else {
        Write-Host "Path or Asset is unreachable" -ForegroundColor Red
    }
}

The above tests connectivity to the path (and thus the server), and if that succeeds, it shows you the contents of the ccmcache folder sorted by LastWriteTime, newest first.

This way, by comparing the cmd.exe list we ran in Step 1 against this list, we’ll know what’s missing on both servers, and we might notice a pattern if there’s a stubborn KB that’s playing hard to get.
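The comparison itself doesn’t have to be done by eye either; a sketch using Compare-Object on the two servers’ KB lists (SERVER1 and SERVER2 as above):

$a = Get-HotFix -ComputerName SERVER1 | Select-Object -ExpandProperty HotFixID
$b = Get-HotFix -ComputerName SERVER2 | Select-Object -ExpandProperty HotFixID

# SideIndicator "<=" means the KB is only on SERVER1, "=>" only on SERVER2
Compare-Object $a $b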