Follow-up: HP Updated #PowerShell iLO Scripting Tools

This is a follow-up to a previous post, “Using HP iLO Scripting Tools for Windows PowerShell”. HP released an updated version 1.1 of their iLO PowerShell tools on 20 Mar 2014.

Before I jump into the new release and what’s included in it, I want to cover something I received a couple of comments and pings about via Twitter: how to download the installer. I’ll go over that here and then again at the end, since upgrades have been made easier with this release.

First off, I have to say that the HP website and the EMC PowerLink site continually trade places in my mind as the most difficult to navigate and link to. Both contain so much information that I often find it hard to locate what I need, and linking to content is even more hit or miss; I’m never sure a deep link will still work the next time I need it.

Enough sidebar. The HP iLO PowerShell cmdlets can be downloaded with a four-step procedure:

One:

Visit the vanity URL http://hp.com/go/powershell and click the red [DOWNLOAD] button.

HP Powershell update 1
Two:

You should see the HP Support Center page. From there, click View and download all drivers, software and firmware for this product.

 

HP Powershell update 2

Three:

There are multiple versions of the installer. Choose the one for your flavor of Windows 7/8/2012 in 32- or 64-bit:

HP Powershell update 3

Four:

After all that, you’re presented with the actual MSI. Click [DOWNLOAD].

HP Powershell update 4

 

Run that MSI and let it do its thing. If you already have a previous version installed, it will update it with the newest content and features.

Either way, when it’s finished, pop open a new PowerShell session.

First, let’s check the version:

PS C:\> Get-HPiLOModuleVersion

Previously I wrote about the 1.0.0.0 release. As you can see, this latest release is 1.1.0.0.

HP Powershell update ver 1.1

This latest release includes a few new cmdlets.

List the available cmdlets:

PS C:\>Get-Help *HPiLO*

Side by side, we can see the new additions:

HP Powershell update ver 1.1 - list

Previously, there were 110 cmdlets available.

Throw a quick Measure-Object on the pipeline and you can see the count now includes 5 additional cmdlets.
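For reference, this is the one-liner behind that count, just the same wildcard from above piped to Measure-Object:

PS C:\> Get-Help *HPiLO* | Measure-Object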

HP Powershell update ver 1.1 - count

While Get-*HPiLO seems to contain the same number, there are 3 new Set-*HPiLO cmdlets.

HP Powershell update ver 1.1 - count-get-set
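If you’d rather not eyeball the side-by-side lists, here is a quick sketch that breaks the count down by verb (it assumes the module loads under the HPiLOCmdlets name mentioned below):

PS C:\> Get-Command -Module HPiLOCmdlets | Group-Object Verb | Sort-Object Count -Descending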

 

There are also two new Update-*HPiLO cmdlets:

  Update-HPiLOFirmware

DESCRIPTION

The Update-HPiLOFirmware cmdlet copies a specified file to iLO, starts the upgrade process, and reboots the iLO after the image has been successfully flashed. You must have Configure iLO Settings privilege to execute this command.

This should serve as a useful method of pushing out iLO updates via PowerShell.
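As a rough sketch of what such a push could look like (the -Location parameter name and the firmware file path are my assumptions here, so check Get-Help Update-HPiLOFirmware -Full before pointing it at a real host):

# Hypothetical example; verify parameter names and use your own firmware image path
Update-HPiLOFirmware -Server $iLOIP -Username $usr -Password $pass -Location "C:\Firmware\ilo4_firmware.bin"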

The other new Update cmdlet is the one I referred to earlier as making module updates easier:

Update-HPiLOModuleVersion

The help for this command indicates that it checks whether a newer version of the HPiLOCmdlets is available for download on the HP website. If there is, it will open a new browser window to download it.

HP Powershell update ver no new

Since this is the first release to include the update cmdlet, there is currently nothing for it to do. I referred earlier to the navigation often required to find something on the HP website, and this will hopefully help. What I would actually prefer is for the utility to download the MSI directly from the command line.

 

Down to something more concrete and useful:

Let’s use these tools to take a look at the integrated Storage Controller.

In this case I found that the G7 DL380s I have in the lab do not respond, but I do have some G8s with an HP Smart Array P420i that answer right up.
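Here is the minimal setup assumed in the lines that follow; the IP address and credentials are placeholders for your own lab values, and the module name is the HPiLOCmdlets package mentioned above:

# Placeholder lab values; substitute your own iLO address and credentials
Import-Module HPiLOCmdlets
$iLOIP = "192.168.1.50"
$usr   = "Administrator"
$pass  = "P@ssw0rd"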

$ld = Get-HPiLOStorageController -Server $iLOIP -user $usr -pass $pass

$ld

HP Powershell-StorageController-g8-1

From this view, we can see general controller status in addition to the Logical Drive(s).

$ld.LOGICAL_DRIVE

If we dig into the Logical Drive(s), this is what we can see about each:

HP Powershell-StorageController-g8-2

Physical Drive details are also visible:

$ld.LOGICAL_DRIVE.Physical_Drive

HP Powershell-StorageController-g8-3

Since publishing my first post, I’ve been chatting over Twitter with a PM at HP about my experience with this, their first tool released for PowerShell. She assures me more PowerShell goodness is in the pipeline, and I hope to try it out and share more with you here.

Have you given the HPiLO PowerShell tools a try yet? What are your thoughts?

In the same vein, I was inspired to try out the HP Insight Control – server provisioning tools.

Once upon a time I took a look but was discouraged by the initial installation process: the time it would take to install an OS, then the dependencies, and then the software components of the Insight Control server provisioning tools.

I found that there is now a virtual appliance available for either VMware (OVF) or Microsoft Hyper-V (VHDX). That said, at a whopping 11 GB the download is not exactly slim! But the time gained from not having to install an OS and all the dependencies should make it worthwhile.

I’ll try to post about my experience here.

One more question:

Do you prefer the ease of vApps for downloading and running software solutions? Why or why not?

Let me know in the comments or on Twitter @kylemurley.

 

Today, Thursday 12 September at 7pm PST, #vBrownbagLATAM in Spanish: @virtualizecr & @PunchingClouds present an intro to #VSAN http://bit.ly/BrownbagLATAM

ProfessionalVMware #vBrownBag LATAM
#vBrownBag in Spanish is an expansion of the ProfessionalVMware.com platform for professional growth and contribution to the Spanish-speaking VMware community in Latin America, Spain, and around the world.

Day 1 of pre-#PuppetConf Training: Puppet Fundamentals for System Administrators #Puppetize @PuppetConf

This is a summary of my experience on the first day of PuppetConf pre-conference training, Puppet Fundamentals for System Administrators.

Automation Awesome: Powered by Caffeine

Since my already late-night flight was delayed, I didn’t get to the hotel until well after midnight, too late to eat anything that wouldn’t have kept me up for the remaining four hours before my wake-up call. When I did wake up I was famished and considered dipping out to find some type of breakfast, but since meals are provided during the day I thought I’d give it a go. On the way down to pre-conference training check-in, I happened to get on the elevator with the event planner for Puppet, who welcomed me nicely and said it would be fine to go in early to the dining area. The spread there was not bad, but definitely light: chopped fruit, danishes, muffins, juice and plenty of coffee. I’ll be going out to breakfast tomorrow, though.

Speaking of coffee, there was no coffee in the training rooms, or sodas for that matter; only water, which does a body good but lacks the essential element that powers much of the awesome that IT makes happen: caffeine! Since I had met the event planner in the morning, I mentioned during a break the lack of coffee in each training room, noting that even their own PuppetLabs Training Prerequisites site ( http://bit.ly/17Yee5l ) makes a point of saying there should be good coffee available. Apparently the facilities charge for coffee in each training room (there are 10+) was prohibitive, so one central location was set up. For me this was unfortunately three floors away from the rooms where my training was taking place.

Who dat?

This being my first experience really meeting the Puppet community, I am making an effort to seek out contacts and find out what brings them here and what they use Puppet for in their environments. At breakfast I met a fellow attendee who works for a large corporation in the south that sells stuff on TV… all kinds of stuff… (hint: they have three letters in their name…). We chatted about what brought him to the training and how much previous experience he’d had with Puppet. It turns out his company recently acquired a hosted-infrastructure shop that was already running Puppet, so he was here to learn how the heck what they’d told him could be true: they said they didn’t really have an ‘operations team’ doing the work, only a sparse staff and Puppet. That was enough of a motivator to put him on a plane to SF for a week of learning.

Also at breakfast I bumped into our trainer, Brett Gray, a friendly Aussie who in class introduced himself as a Professional Services Engineer who previously worked in R&D at PuppetLabs and, before that, ran Puppet in production environments at customer sites.

What it be?

Let me say this very clearly: this training is really well thought out! At the hotel we’re in, there are multiple training sessions going on at the same time. In our particular room are about 15 people, including Brett and Carthik, another Puppet employee who is here to assist as we go along. The quality of the content, the level of detail provided for the exercises, and the software provided are all knit together perfectly. It is immediately evident that a significant amount of effort has been invested to ensure that there are minimal hiccups and that learners are not tasked with working around technical hurdles that are not part of the primary learning outcomes. If you struggle with some part of an exercise, it is highly likely that that is exactly where they want you to experience the mental gymnastics, not an artifact of an unexplored issue.

That said, I’ll refer back to my previous post [I am Not a Developer] and say there is some minimal domain knowledge that you do want to have a grasp of. Prior to arrival onsite, an email was sent to attendees pointing to a post on the Puppet Labs Training site http://bit.ly/17Yee5l outlining the essential skills necessary to fully benefit from the training.

As the course prerequisites indicated, it’s essential that attendees have some type of VM virtualization platform. I found the delivery mechanism to be ideal: PuppetLabs Training even goes to the effort of providing the customized VM in multiple formats compatible with nearly every major player (VMware Workstation, Player, Fusion, OpenVM, etc.). Standing up the VM, binding it to the appropriate network and opening access to it via a bridged connection were not explicitly covered as part of the course. Out of the dozen-plus students in my session, there was one in particular who was really struggling at first. The staff was incredibly patient and helpful getting this person up and running without derailing the rest of the training. I have to admit I felt a little bad for them, and at the same time a bit relieved that at least it wasn’t me!

The email also advised attendees to download the training VM, which was version 2.8.1. When we got to class, of course, some revs had happened and the release-to-web version was already out of date. Fortunately, version 3.0 of the training VM was provided to each of us on our very own Puppet USB drive, much better than all of us trying to download it over the hotel wifi or access a share from our personal laptops.

So, what did we do?

This 3-day course covers basic system configuration in master-agent mode. The day began with a brief intro to the company, their recent history and the product, including the differences between Puppet Enterprise and the open source version. We’re primarily working on the CLI, but the GUI is used for some exercises such as signing certs and searching for package versions across all managed nodes.

That covered, we dove into orchestration, the resource abstraction layer, the great thing about idempotency, and many other reasons why Puppet is the future. Lectures and slides move along at a manageable pace, with short exercises to get an immediate chance at applying the material. Next thing I knew it was lunch time. The food at lunch was much better than breakfast. I met several more attendees from places as far away as Korea and Norway. We discussed how the makeup of attendees really runs the gamut, from large HPC federal contractors, to electronic publishers for the education industry, to large and medium enterprise web hosting, all the way down to small startups who must automate to pull off what they’re doing with a skeleton crew.

Feed the Geeks

After lunch we marched ahead with the labs at full steam. We had each built out our agents, connected them to the Puppet master server, and gotten GitHub syncing our changes between the two. We then explored creating our own classes and modules and defining additional configuration parameters, as well as interacting locally with puppet apply and puppet agent --test (which doesn’t just test, it actually applies the configuration, so watch that one!). Once we had built our own classes, it was time to learn how to use them. The distinction between defining and declaring was one of the important lessons here. Specifying the contents and behavior of a class doesn’t automatically include it in a configuration; it simply makes it available to be declared. To direct Puppet to include or instantiate a given class, it must be declared. To declare a class you use the include keyword or the class {"foo":} syntax.
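To make the define-versus-declare distinction concrete, here is a minimal manifest sketch; the ntp class is just an illustrative name, not something from the course materials:

# Defining the class: specifies its contents and behavior, but applies nothing by itself
class ntp {
  package { 'ntp':
    ensure => installed,
  }
}

# Declaring the class is what actually puts it into the node's configuration
include ntp

# Alternatively, the resource-like declaration syntax (use one or the other, not both):
# class { 'ntp': }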

Well, speaking of food, it’s now dinner time and I’m looking forward to meeting up with some other #PuppetConf training attendees for good eats, discussion, and probably an adult beverage or two. I won’t be out too late though; I want to get back to the room and try spinning up my own local all-in-one install of Puppet! Look for more updates tomorrow after Day Two of Puppet Fundamentals for System Administrators.

#vBrownBag LATAM in Spanish: recordings now available

Cross post from: http://professionalvmware.com/2013/03/vbrownbaglatamvideos2013ene/

The webinar series itself is presented in Spanish.

Since the launch of the vBrownBag sessions in Spanish was announced as an expansion of the ProfessionalVMware.com platform, we have been recording the presentations given every Thursday at 6:00 pm Pacific time (PST) (02:00 UTC).

I am happy to announce that we have finally published the recordings of the ProfessionalVMware #vBrownBag LATAM en Español presentations from January.

To make the content easier to access, we have decided to make it available through the vBrownbagLATAM YouTube channel.

Below are the links and the video for each session:

(tip: you can expand to full screen and switch the playback quality to HD 720p).

2013-Jan-31 – NFS Storage / Larry González @virtualizeCR ProfessionalVMware #vBrownBag LATAM

2013-Jan-24 – AutoLab / Ariel Antigua @aantigua ProfessionalVMware #vBrownBag LATAM

2013-Jan-17 – HomeLab / Álvaro Faúndez @alvaro_faundezm ProfessionalVMware #vBrownBag LATAM

2013-Jan-10 – First Informational Session / @kylemurley ProfessionalVMware #vBrownBag LATAM

To make sure you don’t miss the webinar each week, we invite you to register as a participant:

http://bit.ly/BrownbagLATAM

We are still recruiting presenters:

http://bit.ly/BrownbagPresenter

Topics are to be determined and will reflect the needs and abilities of the participants.

Come share and learn together with the members of this, our Spanish-language vBrownBag community!