Friday, April 29, 2011

PowerShell: load all functions into PowerShell from a certain directory

Suppose you're a system admin who uses PowerShell to manage a lot of things on his/her system(s).

You've probably written a lot of functions that do things you regularly need to check. However, if you have to move around a lot and use different machines, you'd have to re-enter all your functions again and again to be able to use them. I even have to do it every time I exit and restart PowerShell, as it won't remember the functions between sessions...

I've written a script that does this for me. I'm posting it here because I want to be certain it's foolproof. The script itself is stored in allFunctions.ps1, which is why I exclude that file in the code.
The basic idea is that you have one folder in which you store all your .ps1 files, each containing a function. In PowerShell, you go to that directory and then you enter:

. .\allFunctions.ps1

The contents of that script are:

[string]$items = Get-ChildItem -Path . -Exclude allFunctions.ps1
$itemlist = $items.split(" ")
foreach($item in $itemlist)
{
    . $item
}

This script will first collect every file in your directory, meaning any non-ps1 files you might have in there too; allFunctions.ps1 will be excluded.
Then I split the long string on the space, which is the common separator here, and run through the result with a foreach loop, each time loading a function into PowerShell.
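
For example, one of the files in that folder might contain a single function. The file name and the function below are made up, purely to illustrate the setup:

# Get-Uptime.ps1 : a hypothetical example of one function per file
function Get-Uptime {
    # How long the local machine has been running since its last boot
    $os = Get-WmiObject Win32_OperatingSystem
    (Get-Date) - $os.ConvertToDateTime($os.LastBootUpTime)
}

Once allFunctions.ps1 has been dot-sourced, Get-Uptime can be called directly at the prompt.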

Suppose you have over 100 functions and you never know which ones you'll need and which you won't. Why not load them all instead of nitpicking?

So I'm wondering, what can go wrong here? I want this to be really safe, since I'm probably going to be using it a lot.

From Stack Overflow
  • Functionality wise I think there are a couple of ways you could improve your script.

    The first is that your script is dependent upon the name of the script not changing. While I don't think it's likely you'll change the name of this script, you never know what mood you'll be in a few years from now. Instead, why not just calculate the name of the script dynamically?

    $scriptName = split-path -leaf $MyInvocation.MyCommand.Definition
    

    The next problem is that I believe your split will fail if you ever place the directory in a path which contains a space. It will cause a path like "c:\foo bar\baz.ps1" to appear as "c:\foo" and "bar\baz.ps1". Much better to remove the split and just use the enumeration from the Get-ChildItem command (a short demonstration of this follows below).

    Also, you are taking a dependency on the current path being the path containing the scripts. You should either make that an explicit parameter or use the path containing the allFunctions.ps1 file (I prefer the latter).

    Here is the updated version I would use.

    # Work out this script's own name and folder so the loader is not tied
    # to a hard-coded file name or to the current working directory
    $scriptName = split-path -leaf $MyInvocation.MyCommand.Definition
    $rootPath = split-path -parent $MyInvocation.MyCommand.Definition
    # Enumerate every .ps1 under that folder, skipping this loader itself
    $scripts = gci -re $rootPath -in *.ps1 | ?{ $_.Name -ne $scriptName }
    foreach ( $item in $scripts ) {
      . $item.FullName
    }
    

    From a security standpoint you have to consider the possibility that a malicious user adds a bad script to the target directory. If they did so, it would be executed along with your allFunctions.ps1 file and could do damage to the computer. But at the point where a malicious user has access to your file system, they could likely do that damage without the help of your script, so it's probably a minor concern.

    Joey : He seems to like using strings for everything in PS :)
    WebDevHobo : Thanks for the help, you mentioned some good points there. And indeed, a hacker that got in will probably do loads more damage anyway. Also, Johan, it's not that I like strings in PS, it's more that I like setting the variable type so that when a user enters a number where a string should be, an error will be thrown instead of the script still trying to do something with it.
    Joey : Get-ChildItem will return a collection of FileInfo/DirectoryInfo objects. No way you could try to use a number there instead.
    WebDevHobo : And there's also no way I could have used the string.split() function on it ;)
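
    As a quick illustration of the path-splitting problem mentioned in this answer (the paths below are made up), this is roughly what happens once the file list has been flattened to a single string and split on spaces:

    # Hypothetical: the [string] cast joins the file names with spaces,
    # so a folder like "C:\foo bar" produces a string such as this one
    $items = "C:\foo bar\baz.ps1 C:\foo bar\qux.ps1"
    $items.Split(" ")
    # Result: "C:\foo", "bar\baz.ps1", "C:\foo", "bar\qux.ps1" -- broken paths
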
  • Include them in your PowerShell profile so they will load automatically every time you start PS.

    Look at Windows PowerShell Profiles for more info about where to find your profile script.

    PS defaults your profile to your "My Documents" folder. Mine is on a network drive, so anywhere I log in, PowerShell points to the same profile folder.

    WebDevHobo : Awesome stuff, didn't know that about PowerShell.
  • You can do this in a simpler way. I have this in my profile:

    ##-------------------------------------------
    ## Load Script Libraries
    ##-------------------------------------------
    Get-ChildItem ($lib_home + "*.ps1") | ForEach-Object {& (Join-Path $lib_home $_.Name)} | Out-Null
    

    Where $lib_home is a folder that stores scripts I want to auto-include. In this case it executes them, so I have the scripts define global functions. You could also dot-source them (replace "&" with ".").
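
Pulling those suggestions together, here is a minimal sketch of what a profile-based loader could look like. The folder path is an assumption, and excluding allFunctions.ps1 only matters if the loader itself lives in the same folder:

# A minimal sketch of a profile-based loader; put something like this in the
# file that $PROFILE points to. The library path below is made up.
$lib_home = 'C:\Scripts\Lib'

# Enumerate the .ps1 files as objects (no string splitting, so paths containing
# spaces are safe) and dot-source each one into the current session
$scripts = Get-ChildItem -Path $lib_home -Filter *.ps1 |
           Where-Object { $_.Name -ne 'allFunctions.ps1' }

foreach ($script in $scripts)
{
    . $script.FullName
}

With that in the profile, every function in the folder is available as soon as a new PowerShell session starts, which also solves the "it won't remember the functions" problem from the start of this post.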
