Using SharePoint Lists and PowerShell to populate content pages

   This article assumes you can write a script and get to the point where you have a SharePoint site and a list object to work with.  It is not a complete script, but it contains the complete SharePoint API element functions you need to complete your own script with very little extra work on your part.

The Issue:

I needed to create 600-ish pages of content from a CSV file of links, descriptions and images.


The page layouts are identical, so it doesn't require hundreds of columns of data to describe each page.  Each set of links on each page is different, but critically, all the links, image paths etc. are in the CSV data file.

Then… the requirement to update the content became one that needed some controls… which are lacking in a CSV file!

I suggested using a SharePoint list – custom columns etc., secure by default – and then wrote this little beauty to go and create all the pages.   I REALLY love PowerShell! – Oh, and the SharePoint API!

Script requirements:

  • Import the content to create pages from a SharePoint list.
  • Create a Web Part Page – use a loop of your choice to create each page, then perform the following ops on each page once you have created it.
  • Clear the title image and change the description in the web page title bar web part (part of page creation).
  • Construct content and populate a CEWP (Content Editor Web Part).
  • Populate the pages and commit them to a SharePoint document library.

Script bits:

Import the required page contents from the SharePoint list into a simple array (I like simple :)):

# Create SimpleListArray

$inputDataArray = @()

$SPList = $web.Lists[$iDataSiteList]

$SPItems = $SPList.Items

I cheat and use the following construct to create an empty array Item:

foreach ($item in $SPItems){

$arrayItem = "" | Select-Object <FieldName1>, <FieldName2> # ...etc, etc...
$arrayItem.<FieldName1> = $item["<FieldName1>"]
# etc...
# Populate into DataArray object.
$inputDataArray += $arrayItem
}

The result should be a nice simple array that you can use when creating your pages…

Create Web Part Page:

So for each page array item we need to go and create the page, alter the header, set the description, and add and populate a CEWP.

This should be a function, btw, that also calls all the other functions, so you have a loop (create page -> create CEWP content -> add CEWP and content) – rinse and repeat for the next array item.  Hope that makes sense; email me if not!
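To make that loop concrete, here's a rough sketch. Create-Web-Part-Page and Build-CEWP-Content are hypothetical wrapper names for the batch XML and content-building snippets shown in this post; Edit-Title-Bar-Content and Add-ContentEditorWebPart are the actual functions defined further down:

# Sketch only - Create-Web-Part-Page and Build-CEWP-Content are hypothetical
# wrappers around the code blocks shown in this post.
foreach ($PageItem in $iDataArray)
{
    $pageUrl = Create-Web-Part-Page $PageItem      # batch XML page creation
    $content = Build-CEWP-Content $PageItem        # business-rules HTML
    Edit-Title-Bar-Content $pageUrl                # fix up the title bar
    Add-ContentEditorWebPart $SiteURL $pageUrl "Header" 0 $PageItem.<TitleField> $content
}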

$Global:list = $iWeb.Lists["$library"];

$Global:iDataArray = $iDataArray

foreach ($PageItem in $iDataArray)
{

$title = $PageItem.<TitleField>

# Batch XML required to create the page and update the Title Bar web part - in this case not setting an image.

$batchXml = "<?xml version=`"1.0`" encoding=`"UTF-8`"?>
<Method ID=`"0,NewWebPage`">
<SetList Scope=`"Request`">$($list.ID.GUID)</SetList>
<SetVar Name=`"Cmd`">NewWebPage</SetVar>
<SetVar Name=`"ID`">New</SetVar>
<SetVar Name=`"Type`">WebPartPage</SetVar>
<SetVar Name=`"WebPartPageTemplate`">2</SetVar>
<SetVar Name=`"Title`">$title</SetVar>
<SetVar Name=`"HeaderImage`"></SetVar>
<SetVar Name=`"Overwrite`">true</SetVar>
</Method>"

# Create Page.
$createPageResult = $web.ProcessBatchData($batchXml)

# Cast the result so we can parse it, for logging or further action.

[xml]$xCreatePageResult = $createPageResult

# Now check for errors, and do what you need to do with them.

if( $xCreatePageResult.Results.FirstChild.Code -ne 0 )
{
    # Do your "I went pear shaped" code for the error - I handle it by writing to a log and continuing.
}

# ... call Edit-Title-Bar-Content and Add-ContentEditorWebPart here (see below), then close the loop.
}

Create a CEWP, populate it and add it to the page:

# Create content based on business rules and content of array item field properties.

$content1 = "<Some HTML snippet with $<variable for your field entry>>"
if ($<fieldName>) { $content2 = "<Some HTML snippet>" }
$finalContent = $content1 + $content2 # ...etc

Edit the page's Title Bar content and remove the default image:

Function Edit-Title-Bar-Content ($iPageURL)
{
# Get the web part manager and find the web part to fiddle with
Open-Site $SiteURL
$webpartmanager = $web.GetLimitedWebPartManager($iPageURL, [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)

for ( $i = 0; $i -lt $webpartmanager.WebParts.Count; $i++ )
{
    if ($webpartmanager.WebParts[$i].GetType() -eq [Microsoft.SharePoint.WebPartPages.TitleBarWebPart])
    {
        $webpart = $webpartmanager.WebParts[$i]
        $webpart.Image = ""
        $webpart.Title = $title
        $webpart.HeaderTitle = $title
        $webpart.ChromeType = [System.Web.UI.WebControls.WebParts.PartChromeType]::None;
        $webpartmanager.SaveChanges($webpart) # Commit the change back to the page
    }
}
# Cleanup

} # End Function.

Populate page – add the web part and content to the page in the SharePoint document library:

Function Add-ContentEditorWebPart($SiteURL, $pageUrl, $webpartzone, $index, $title, $content)
{
$webpartzone = "Header"
# Open-Site $SiteURL
$webpartmanager = $web.GetLimitedWebPartManager($pageUrl, [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)

$webpart = New-Object Microsoft.SharePoint.WebPartPages.ContentEditorWebPart
$webpart.ChromeType = [System.Web.UI.WebControls.WebParts.PartChromeType]::None;
$webpart.Title = $title

# The CEWP takes its content as an XML element, not a plain string.
$docXml = New-Object System.Xml.XmlDocument
$contentXml = $docXml.CreateElement("Content");
$contentXml.InnerText = $content

$webpart.Content = $contentXml;
$webpartmanager.AddWebPart($webpart, $webpartzone, $index);

} # End Function.


Posted September 30, 2011 by stormwalker255 in SharePoint

PowerShell – turn on a Christmas tree (or device of choice)!

This one's about how to use a £5 USB LED Christmas tree, a £30 k8055 experimenter's board, and a bit of C# .NET wrapper code around a C++ DLL to make an alert "light" work from PowerShell.

The commands are: tree-on & tree-off.

It gives our team a lovely visual clue when SharePoint is feeling off-colour. – We're switching it the other way around for Christmas (switching it off when there's a problem!)

You know, with ULS logs there is so much information in there, and if you've read my article on how to retrieve the critical entries in an almost real-time fashion, then you're almost there… this goes really well with that previous script.


For the k8055 board you'll need to take the code below, slap it in a Visual Studio of some description, and compile it as a DLL.

I’ve not tried any open source compilers, but don’t see why they wouldn’t work…

It doesn't work under 64-bit PowerShell, which I could do with some help to solve.
I think it's because the underlying C++ DLL is 32-bit, but I have been told that really shouldn't make a difference… happy to learn if different – I'm a code-by-the-seat-of-your-pants kind of guy!
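For what it's worth, a 32-bit native DLL genuinely can't be loaded into a 64-bit process, so one common workaround (a sketch – the script name here is made up, and I've assumed the standard install path) is to run the tree commands from the 32-bit PowerShell that ships alongside the 64-bit one:

%windir%\SysWOW64\WindowsPowerShell\v1.0\powershell.exe -NoExit -File <your-tree-script>.ps1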

k8055 wrapper DLL Code:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Text;
using System.Runtime.InteropServices;

namespace Dansk8055
{
public partial class Methods
{
// Device
private bool[] devs = { false, false, false, false };
private int crtDev = -1;

// Digital
private int[] dbtime = { 0, 2, 10, 1000 };
private enum enmSigTypeD { off, cycle, cup, cdn };
private enmSigTypeD sigTypeD = enmSigTypeD.off;

private int stateDI, stateDO, c1, c2;
private bool doChanged;

// Analogue
private enum enmAOSigType { off, ramp, saw, sine, i1, i1i, i2, i2i };
private enmAOSigType[] sigTypeA;
private int ai1, ai2, ao1, ao2;
private int ao1idx, ao2idx, ao1Delta, ao2Delta;
private bool ao1Changed, ao2Changed;

private int[] ramp, saw, sine;

// ============================================================
// API ========================================================
// ============================================================

// Following wraps K8055D.DLL
// (K8055D_C.DLL is required for .net DLL, as used in K8055_test_D)
// *** have not checked if K8055D_C.DLL can be used here as well ***
public sealed class api // DLL Wrapper
{
#region API Declares

[DllImport("K8055D.dll")]
public static extern int OpenDevice(int devNumber);

[DllImport("K8055D.dll")]
public static extern void CloseDevice();

[DllImport("K8055D.dll")]
public static extern int ReadAnalogChannel(int Channel);

[DllImport("K8055D.dll")]
public static extern void ReadAllAnalog(ref int Data1, ref int Data2);

[DllImport("K8055D.dll")]
public static extern void OutputAnalogChannel(int Channel, int Data);

[DllImport("K8055D.dll")]
public static extern void OutputAllAnalog(int Data1, int Data2);

[DllImport("K8055D.dll")]
public static extern void ClearAnalogChannel(int Channel);

[DllImport("K8055D.dll")]
public static extern void SetAllAnalog();

[DllImport("K8055D.dll")]
public static extern void ClearAllAnalog();

[DllImport("K8055D.dll")]
public static extern void SetAnalogChannel(int Channel);

[DllImport("K8055D.dll")]
public static extern void WriteAllDigital(int Data);

[DllImport("K8055D.dll")]
public static extern void ClearDigitalChannel(int Channel);

[DllImport("K8055D.dll")]
public static extern void ClearAllDigital();

[DllImport("K8055D.dll")]
public static extern void SetDigitalChannel(int Channel);

[DllImport("K8055D.dll")]
public static extern void SetAllDigital();

[DllImport("K8055D.dll")]
public static extern bool ReadDigitalChannel(int Channel);

[DllImport("K8055D.dll")]
public static extern int ReadAllDigital();

[DllImport("K8055D.dll")]
public static extern int ReadCounter(int CounterNr);

[DllImport("K8055D.dll")]
public static extern void ResetCounter(int CounterNr);

[DllImport("K8055D.dll")]
public static extern void SetCounterDebounceTime(int CounterNr, int DebounceTime);

[DllImport("K8055D.dll")]
public static extern void Version();

[DllImport("K8055D.dll")]
public static extern int SearchDevices();

[DllImport("K8055D.dll")]
public static extern int SetCurrentDevice(int lngCardAddress);

#endregion
}

// ============================================================
// DEVICE =====================================================
// ============================================================

private bool devForceCurrent(int devNumber)
{
    // Set current device returning true/false
    // (coded to deal with USB insert/removal)
    if (api.SetCurrentDevice(devNumber) != devNumber)
        if (api.OpenDevice(devNumber) != devNumber)
            return false;

    crtDev = devNumber;
    return true;
}

private bool devSetCurrent(int devNumber)
{
    // Set current device returning true/false
    // (coded to deal with USB insert/removal)
    if (devNumber == crtDev) return true; // Avoid redundant calls if poss
    return devForceCurrent(devNumber);
}

private bool devExists(int devNumber)
{
    return devForceCurrent(devNumber);
}

private int devCount()
{
    // Relies on devExists to try to count devices
    int count = 0;

    if (devExists(0)) count++;
    if (devExists(1)) count++;
    if (devExists(2)) count++;
    if (devExists(3)) count++;

    return count;
}

// ============================================================

public int devOpen(int devNumber)
{
    return api.OpenDevice(devNumber);
}

public void devClose()
{
    api.CloseDevice();
}

private void devInit()
{
    stateDI = stateDO = 0;

    ai1 = ai2 = ao1 = ao2 = 0;
    ao1Delta = ao2Delta = 1;
    ao1idx = ao2idx = 0;
    doChanged = ao1Changed = ao2Changed = false;
}

// The one-liners below are simple pass-throughs onto the API wrapper.
public void devSetDigitalChannel(int Channel) { api.SetDigitalChannel(Channel); }

public void devSetAllDigital() { api.SetAllDigital(); }

public void devClearDigitalChannel(int Channel) { api.ClearDigitalChannel(Channel); }

public void devClearAllDigital() { api.ClearAllDigital(); }

public void devSetAllAnalog() { api.SetAllAnalog(); }

public void devClearAllAnalog() { api.ClearAllAnalog(); }

// ============================================================
// ============================================================
// ============================================================

public bool devConnectX(int devNumber)
{
    if (crtDev > -1) return true; // Already connected

    crtDev = devOpen(devNumber);

    return true;
}

public bool devConnect(int devNumber)
{
    if (devSetCurrent(devNumber) == false)
        return false;
    return true;
}
}
}


Load the remote assembly:

$dllFullPath = "\\<server>\scripts$\k8055\Dansk8055\Dansk8055\bin\Debug\dansk8055.dll"

$evidence = [System.Reflection.Assembly]::GetExecutingAssembly().Evidence

[System.Reflection.Assembly]::LoadFrom("$dllFullPath", $evidence) > $null

Function Global:Tree-On() {

# Relies on the library DLL being loaded first by the calling script.

$outt = New-Object dansk8055.Methods

$outt.devOpen(0) > $Null

$outt.devConnect(0) > $Null

# Switch the output on - channel 1 is an assumption here; use whichever
# digital output your tree is actually wired to.
$outt.devSetDigitalChannel(1) > $Null

} # End Function.

Function Global:Tree-Off() {

# Relies on the library DLL being loaded first by the calling script.

$outt = New-Object dansk8055.Methods

$outt.devOpen(0) > $Null

$outt.devConnect(0) > $Null

# Switch the output off again - same channel assumption as Tree-On.
$outt.devClearDigitalChannel(1) > $Null

} # End Function.

Posted September 30, 2011 by stormwalker255 in SharePoint

Tweet with PowerShell

I wanted to type "tweet <Blah>" from a PowerShell session and have it appear in my tweet stream; nothing fancy, right?

 – Mine has to go through an authenticating proxy, so there's a bit of script for that also. You could store the credentials in a file, which saves typing them in, but that's for another time.

OK, so here’s how you do that.

There are probably a hundred ways to do this; here's the one I used.

(I will be adding a suitable code display add-in, once someone shows me how!)

So – first step, slap this code in your $profile (notepad $profile) for a PS prompt.

The second step comes after the script, but you need to do it first to get your twit codes, which you will need to *insert* into this script.  You will also need to download the API assemblies and make them available in your script dir; I'll provide links.

[Reflection.Assembly]::LoadFile("c:\Scripts\DLLs\Twitterizer2.dll") > $null
[Reflection.Assembly]::LoadFile("c:\Scripts\DLLs\Newtonsoft.Json.dll") > $null

$cred = Get-Credential
$tokens = New-Object Twitterizer.OAuthTokens;
$options = New-Object Twitterizer.StatusUpdateOptions
$proxy = New-Object System.Net.WebProxy
$proxy.Address = "<your proxy address here>"
$proxy.Credentials = $cred.GetNetworkCredential()
# $proxy.UseDefaultCredentials = $true
$options.Proxy = $proxy
$tokens.AccessToken = "***************your access token stuff****************"
$tokens.AccessTokenSecret = "***your access token secret*********************"
$tokens.ConsumerKey = "*********your consumer key *****"
$tokens.ConsumerSecret = "****************your secret ****************"

function Tweet($cheese)
{
    $global:f = [Twitterizer.TwitterStatus]::Update($tokens, $cheese, $options);
    if ($f.Result -eq "Success")
    {
        Write-Host "Successful Tweet..."
    } else {
        Write-Host "Fail, Epic..Fail Whale...no Tweet..."
    }
}

Posted September 16, 2011 by stormwalker255 in Uncategorized

SharePoint ULS Critical Events – How to display them as they happen (almost) in a PowerShell session window

This will probably be quite a long post; you may want to grab coffee and cake! – It's aimed at people with intermediate to advanced PowerShell knowledge.

Monitoring our Critical ULS logs – no errors!

I'm a working techie guy who bolts PowerShell together as I need it, so it comes from all over.  If you want attribution for a block of code I borrowed from the internetz that came from your blog or article some place, please do let me know and I'll be delighted to add attribution – if I remember where I got something, I'll attribute as I go. 🙂

As the saying goes, you can’t manage what you can’t see. – So now we *can* see! 🙂


I want reasonably immediate information about how SharePoint is performing, especially Critical Events – I want the detail to be available to me to action ASAP.   Being a bit of a PowerShell fan, I wanted to see any Critical Events happening to SharePoint pop up in a PowerShell session window. – And, most importantly, I want to light up a Christmas tree when I get a Critical Event!!

I can’t give you entire working scripts here, they belong to my employer.  I can however give code examples to point you in the right direction in terms of script block elements I wrote/borrowed to get this information displaying in a Powershell session window.  You’ll need to create scripts with these elements and add your own support code.  Please email me if you need assistance, time willing, I’ll do my best to help you.

Note: you can ignore script 1 if, in script 2, you define which logs you want to go and look at.  Script 1 also gives nice summary views of 24hrs' worth of logs in HTML, but if you don't need that, skip to script 2 for PowerShell displaying goodness…

Script 1:  Measure-ULS-Log-Events-Incidence.ps1


About a year or so ago, I wanted an HTML output of the number of each type of event happening within the ULS logs on our farm over 24hrs, to give me a clue about what was going on without me having to sit with a log viewer open 24×7 across a number of machines.


You need to back up the latest ULS log files from each server in your farm.

I then used MS LogParser to iterate through the logs for the information we need; in this case, just the number of each event type per server.

foreach ($server in $servers) {

&$logParserPath\LogParser -i:tsv -o:tsv file:\\<server>\scripts$\logparser\mossukoverviewstats-<server>.sql?COMPUTER_NAME=$server

}


Where the above-referenced SQL query file contains the following:

SELECT  Area         AS [Area],
        Category     AS [Category],
        Level        AS [Level],
        COUNT(*)     AS [Count]
INTO %COMPUTER_NAME%-area-category-level.tsv
-- (FROM clause pointing at your backed-up ULS log files goes here)
GROUP BY Area, Category, Level



These get saved out as files by log parser for us to use in a minute.

 # Get the last few files
 $logfiles = ls -r -fi *.txt | sort @{expression={$_.Name}},@{expression={$_.LastWriteTime};Descending=$true} | select Directory, Name, LastWriteTime | Group-Object Name | %{$_.Group | Select -First 1}
 $logcount = $logfiles.Count
 Write-Debug "Log Count: $logcount"
 # Create an ArrayList object to hold results data
 $global:results = New-Object System.Collections.ArrayList

# Check and make sure there are at least three logs to compare with
 if ($logcount -lt 3) {
  Write-Host "Not enough logs written to make a comparison - waiting until next run" -ForegroundColor Yellow;
 } Else {
  Foreach ($log in $logfiles) {
   $logFile = $log.Name
   Write-Debug "File construct : $logFile";
   $fileData = Import-Csv $logFile
   $temp = $fileData
   $temp = Add-Member -InputObject $temp -PassThru -MemberType NoteProperty -Name prop -Value 5
   # Using .NET object
  }
 }
 # Display file counts over time
 #$global:results | ft
 $results | ConvertTo-Html | Out-File $htmlSummaryOutput
 $Global:TotalCountResultsArray | ConvertTo-Csv | Out-File $csvSummaryOutput

You'll next need a function to count these events from those files; the counts are then output into a unique time-stamped file:

# Set $now to current time/date
$now = [datetime]::Now

# Convert $now to ticks
$now = $now.Ticks

$Global:TotalCountResultsArray = @()
# Declare vars at the outermost scope they are required in, else things go weird.
$critstat_total = 0 ; $medstat_total = 0 ; $unexstat_total = 0 ; $highstat_total = 0 ; $monistat_total = 0
foreach ($server in $servers) {
  $Global:CRA = @()
  $Global:CRA = "" | Select Server, Critical, Medium, Unexpected, High, Monitorable
  $Global:CRA.Server = $server
  $script:statsFile = Import-Csv -Delimiter `t $server-*.tsv;
  Write-Host "$($statsFile.Count)" -ForegroundColor Green
  $critical_count = 0 ; $medium_count = 0 ; $unexpected_count = 0 ; $high_count = 0 ; $monitorable_count = 0
  foreach ($stat in $statsFile) {
    if ($stat.Level -eq "Critical"){[int]$critical_count = $stat.Count; $Global:CRA.Critical = $critical_count };
    if ($stat.Level -eq "Medium"){[int]$Medium_count = $stat.Count; $Global:CRA.Medium = $Medium_count };
    if ($stat.Level -eq "Unexpected"){[int]$unexpected_count = $stat.Count; $Global:CRA.Unexpected = $unexpected_count };
    if ($stat.Level -eq "High"){[int]$high_count = $stat.Count; $Global:CRA.High = $high_count };
    if ($stat.Level -eq "Monitorable"){[int]$monitorable_count = $stat.Count; $Global:CRA.Monitorable = $monitorable_count };
    Write-Host `n
    # Write-Debug $stat
    Write-Host "Line count field: $($stat.Count)" -ForegroundColor Green
    Write-Host "================ ";
    Write-Host " ";
    Write-Host "[$server]: Critical Count   : $critical_count"
    Write-Host "[$server]: Medium Count     : $Medium_count"
    Write-Host "[$server]: Unexpected Count : $unexpected_count"
    Write-Host "[$server]: High Count       : $high_count"
    Write-Host "[$server]: Monitorable Count: $monitorable_count"
    if ($critical_count -ne $Null) {$critstat_total = $critstat_total + $critical_count; $servercrit = $server
      "<tr><td>$Server</td><td class = ""thresholdhigh"">Critical ULS Events</td></tr> " >> \\<server>\wwwroot$\console\alertconsole.htm }
    if ($medium_count -ne $Null) {$medstat_total = $medstat_total + $medium_count}
    if ($unexpected_count -ne $Null) {$unexstat_total = $unexstat_total + $unexpected_count}
    if ($high_count -ne $Null) {$highstat_total = $highstat_total + $high_count}
    if ($monitorable_count -ne $Null) {$monistat_total = $monistat_total + $monitorable_count}
  }
  # Add to global results array
  $Global:TotalCountResultsArray += $Global:CRA
}

Write-Host "`nTotal of Stats: ";
Write-Host "================ ";
Write-Host " ";
Write-Host "Critical Count   : $critstat_total"
Write-Host "Medium Count     : $medstat_total"
Write-Host "Unexpected Count : $unexstat_total"
Write-Host "High Count       : $highstat_total"
Write-Host "Monitorable Count: $monistat_total"
# Write the count to file, and echo it to the screen
"$Now, $critstat_total, $medstat_total, $unexstat_total, $highstat_total, $monistat_total" >> "$now-measuring.txt"
Write-Host "$Now, $critstat_total, $medstat_total, $unexstat_total, $highstat_total, $monistat_total"

The next bit involves loading up these files, working out the distribution of event types, and outputting it as HTML.

This script was lastly modified to spit out a CSV file, so script 2 had something easy to work with.


Script 2:  Watch-ULS-Display-New-Critical-Entries.ps1

Set a file watcher to see when script 1's output CSV file is updated.  (If you are not bothering with script 1 – which flags when a critical event has happened – then you'll need to script another way to decide when you are going to parse the ULS logs.)

# Create a file system watcher.
 $watcher = New-Object System.IO.FileSystemWatcher
 $watcher.InternalBufferSize = 65536
 $watcher.Path = $iSearchPath

# Set up which file / directory to watch.
 if ( $iSearchFile )
 {
  Write-Host "`nLooking for file changes only in: [$iSearchFile] in Path [$iSearchPath] filter..." -ForegroundColor Yellow;
  $watcher.Filter = $iSearchFile
 }
 $watcher.IncludeSubdirectories = $true
 $watcher.EnableRaisingEvents = $true

 # Subscribe to the events generated by the $watcher.

$changed = Register-ObjectEvent $watcher "Changed" -Action $changeAction
 $created = Register-ObjectEvent $watcher "Created" -Action {
     Write-Host "Created: $($eventArgs.FullPath)" -ForegroundColor Green; $iFile}
 $deleted = Register-ObjectEvent $watcher "Deleted" -Action {
     Write-Host "Deleted: $($eventArgs.FullPath)" -ForegroundColor DarkYellow; $iFile}
 $renamed = Register-ObjectEvent $watcher "Renamed" -Action {
     Write-Host "Renamed: $($eventArgs.FullPath)" -ForegroundColor Red; $iFile }

The $changeAction script block defines your code for checking the CSV file and then looking through the ULS logs; in my case it recurses and checks each entry of the CSV file, which gives a server and an integer for each event type.

For critical events, if it finds a number, we retrieve the name(s) of the server(s) that are also in the same item.

# $changeAction definition.

$changeAction = {code}

Once you know which server's ULS log(s) the event(s) are in, you can then go and get those logs; script 1 has already pulled the current ULS logs over to temp storage, so you can get them from there, or you'll need to provide another way if script 1 is not being used.

OK, the next thing you'd possibly want to do is iterate through the ULS log(s).  I thought a bit about this.  ULS logs can be big, very big, or titchy.  Potentially get-content "could" run out of memory to manipulate the file.  Let's not take a chance; we'll use a StreamReader thingie instead.  (I think PS2 can use get-content with StreamReader-type behaviour.)

$reader = New-Object System.IO.StreamReader($iFile)
$regExMatch = "Critical"
# Find the droids we are looking for...
Write-Host "Finding Entries..." -ForegroundColor Yellow
While ( !$reader.EndOfStream )
{
    $line = $reader.ReadLine();
    If ( $line -match $regExMatch )
    {
        # These are the droids we are looking for.....
        Write-Host "These are the Droids we are looking for!" -ForegroundColor Red
        Write-Host " [$server]: " -NoNewline -ForegroundColor Yellow
        Write-Host $line -ForegroundColor Green -BackgroundColor Black;
        # Switch on our Tree.
        Tree-On
    }
}

That little lot parses through our ULS log, matches lines that have "Critical" in them, and spits the output out into our PowerShell window.

Don't forget a $reader.Dispose() when tidying up between logs, btw 🙂
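One way to make sure the reader always gets cleaned up, even if something throws mid-parse, is a try/finally around the read loop (a sketch, using the same $iFile variable as above):

$reader = $null
try {
    $reader = New-Object System.IO.StreamReader($iFile)
    # ... the ReadLine()/match loop from above goes here ...
}
finally {
    # Always runs, even on an exception part-way through the log.
    if ($reader) { $reader.Dispose() }
}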

– My "Tree-On" is a nifty bit of PowerShell connecting to a custom DLL via a k8055 interface board – this switches on an actual "Alert Christmas Tree" – no expense has been spent!! (Well, apart from the fiver it cost for the USB tree, and the time taken to lop the USB connector off and connect it to the card outputs!)

Other bits: as we were testing the script and re-running it, I wanted to make sure to cancel the events we had registered.

This command cancels any active event subscription, and tells you what it cancels:

Get-EventSubscriber | % {Write-Host ” Cancelling Event Name: [$($_.EventName)] – Subscription ID: [$($_.SubscriptionID)]”;Unregister-Event -SourceIdentifier $_.SourceIdentifier -force }


OK, I think that's cogent enough to let you figure the rest out, but if not, please do let me know, as I'd like the community to get some use from all this 🙂


Posted September 7, 2011 by stormwalker255 in SharePoint


Usage Data in SharePoint – PowerShell retrieval – Last Accessed Date for a Site Collection

This is a three-line(ish) method of giving PowerShell love to the SharePoint 2007 usage stats you need to get the "Last Accessed Date" from your site collection… It works for me; if it's wrong or you have comments, please do leave me a note 🙂

For me the use of this is to give me a “Last Accessed Date” for a site collection, as I couldn’t see (I didn’t look far) another way of quickly getting that information from the SharePoint API.

$web.Usage doesn't cut it for us, as we have stuff that modifies the site running in the background, and it doesn't provide a last access date anyway.

So, let's get on with the good stuff… I'm assuming you have loaded the SharePoint main and Administration assemblies, and have a site collection attached to $web.

$ReportType = [Microsoft.SharePoint.Administration.SPUsageReportType]::url
$ReportPeriod = [Microsoft.SharePoint.Administration.SPUsagePeriodType]::lastMonth
$usage = $web.GetUsageData($ReportType, $ReportPeriod)

Wibble through all the site URLs with a loop of your choosing… then do a sort on the date to get the last one.  Peasey!

If you do a lookup of the ReportType methods you can get lots more information out from the usage statistics.

Not checked against 2010, but it works fine in 2007, so all things being reasonably equal, it should work in 2010 also.

Resulting object coming back is a DataRow.

I ferttle through that with:

foreach ($row in $usage.PSBase.Rows) {<your stuff here>}

as for me that's the easiest way of getting the "Most Recent Day" column data for a URL.

Thus: $thisRecentDay = $row."Most Recent Day"

Personally, I stuff that into an array, loop to collect 'em all, and do a sort.
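As a sketch, that collect-'em-all loop looks something like this ("Most Recent Day" is the column we pulled out above; the Url property here, and the column name I assign it from, are illustrative – check your own DataRow's columns):

$Global:mostRecentDayArray = @()
foreach ($row in $usage.PSBase.Rows) {
    $entry = "" | Select-Object Url, Date
    $entry.Url  = $row.Url                 # assumed column name - verify against your DataRow
    $entry.Date = $row."Most Recent Day"
    $Global:mostRecentDayArray += $entry
}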

Then I figured I was in a world of hurting, as I had sites with UK, and sites with US date formats, deepness of joy!


# The try should sort your local culture stuff; the catch takes care of the
# other stuff... change your M, d and yyyy's as required.
try {

$Global:ordered = $Global:mostRecentDayArray | Sort-Object {[DateTime]::Parse($_.Date)}

}
catch {

$Global:ordered = $Global:mostRecentDayArray | Sort-Object {[DateTime]::ParseExact($_.Date,"Mdyyyy", [System.Globalization.CultureInfo]::InvariantCulture)}

}

That’ll take care of the date parse.

Once sorted, just grab the last date, and you have the last accessed date for your site collection…

$Global:lastDataAccessed = $Global:ordered[$Global:ordered.Count-1] (you'll need to add another line and a logic check to see how many entries are in the array…)

Handy for finding last access information for your sites stats page!


TinyTic on the Linux BBS

Well, BBS’ing is still in my system, and I know, really…I should be over it by now.

There is just something about running old fashioned command line stuff that I love.

So… last night I spent a couple of hours, well, five to be precise, fixing a bug in a little utility called TinyTic.

TinyTic is a file tosser: it takes incoming files from file networks and, with some help from their *.tic description files, moves them to the correct area/directory on the BBS so people can see them to download – only in this case, it didn't quite do what it said on the tin.

It was one of those mad bad issues that nearly drove me from the product and into searching for another.

I stuck with it, and the great thing with SourceForge stuff is that you get to play if it doesn't work the way you want.

It was only a small bug: the program kept complaining it couldn't find its config file when attempting file moves, and failed to move the files.  It knew where the files to move were, which meant it must have opened the config file initially.  My suspicion, eventually confirmed, was that once it loaded, it switched to the inbound data directory where the files were being held, and then tried to load the config file again without path-sourcing it… yay!
So five hours of looking and a few lines of code had this handy Tiny TIC file processor running happily on my linux BBS box. My Synchronet BBS now has a new working tool to import files coming in on my file feed, and I’m a happy c++ bunny!

If you are trying to get TinyTic working under linux then this is the code you’ll need to change:

Open up tprocess.cpp in your favourite editor.

Find the CheckArea function:

Go and edit the first section so it looks like this:

— start of change

bool CheckArea(const char *area_name, string &Destination)
{

fstream fConfig;

char    szTmpBuffer[ BUFFER_MAXLEN ] = "\x0";

string  Buffer;

string  Log;

bool    AreaExists = false;

Buffer = CurrentDirectory.c_str();

#if defined(__TINYTIC_WINDOWS) || defined(__TINYTIC_OS2)

Buffer += "\\";

#elif defined(__TINYTIC_POSIX)

Buffer += "/";

#endif

Buffer += szConfigFile.c_str();

— snip end of change

You'll see the first few lines are as they currently are in the code; then we change the first Buffer entry to CurrentDirectory (which is populated with the name of the directory you start the utility in, in the main tic code file), and in theory (I can't test under DOS) it will still work under DOS when compiled that way.

Compile, create the binary – you should be good to go!!


Posted May 11, 2011 by stormwalker255 in BBS Stuff


SharePoint 2007 – Monitor Incoming Email with PowerShell

EDIT: Ages later… I've still not done the public version of this script; it's around 1.5k lines in production, and bah.   If you really, really want this, let me know and I'll make the effort.

Well, it's a big script of about 1000 lines or so, but I like it, and it works, and I finished it today – v1.0 beta at any rate.

What does it do, and how does it do it?

Well, it ferckles through the IIS SMTP Logs and the SharePoint ULS logs (with E-Mail logging turned to max – naturally!) and gives you:

HTML reports:

  • Top e-mailers per X hours (where you decide X).
  • List of incoming e-mails, with both senders and recipients.
  • Exchange server list (which internal servers are forwarding emails to us).
  • Total e-mails received, and the average incoming per hour!

EDIT: Oh yes – you can also see “errors” (note to self: which was the whole point really now wasn’t it?)

I’m working on a conversion to be able to post it here, and hopefully will have that done within the week!

Tech stuff:

It has guest appearances by Microsoft LogParser, a file StreamReader to read open ULS logs, plenty of regular expressions, and a dash of "read from the SharePoint config database" to get your SharePoint email addresses to check against.

I'm toying with the idea of doing the final hop – reading back from the library the email address belongs to, to ensure the E-Mail Timer Job did do its thing correctly.  Hmmm, about that, I'm not sure, and I can't remember if that's possible to construct from an initial config database read (I know about the SharePointy API stuff) or if it maybe needs something else…

Watch this space!


Posted January 19, 2011 by stormwalker255 in SharePoint
