Backup or Store stuff to GMail via IMAP in Ruby

June 10th, 2008 by Daniele Muscetta

Once upon a time, I used to store some small automated backups in GMail just by having the scheduled backup send an email to my GMail account. At one stage they blocked me from doing so, marking those repeated emails as spam.

After that, I took a different approach: I kept sending the mail to a mailbox on the SAME server as the backup, and using IMAP I could DRAG-and-DROP the backup attachment from the mailbox on one server to the mailbox on the other server (GMail). They did not mark me as a spammer that way, of course.
So that worked for a while, but then I got tired of doing this manually.

So the following Ruby script is how I automated the "move offsite" part of that backup.
For completeness, I will give due credit to the person who set me on the right track: I started off from this example by Ryan.

#!/usr/bin/env ruby

require 'net/imap'

begin_ = Time.now

## Source Info (placeholder values - replace with your own)
$SRCSERVER = "mail.example.com"
$SRCPORT   = 993
$SRCSSL    = true
$SRCUSER   = "backup@example.com"
$SRCPASS   = "sourcepassword"
$SRCFOLDER = "INBOX"

## Destination Info (placeholder values - replace with your own)
$DSTSERVER = "imap.gmail.com"
$DSTPORT   = 993
$DSTSSL    = true
$DSTUSER   = "username@gmail.com"
$DSTPASS   = "destinationpassword"
$DSTFOLDER = "Backups"

# Connect to source
puts "connecting to source server #{$SRCSERVER}... \n\n"
srcimap = Net::IMAP.new($SRCSERVER, $SRCPORT, $SRCSSL)
srcimap.login($SRCUSER, $SRCPASS)
srcimap.select($SRCFOLDER)

# Connect to destination
puts "connecting to destination server #{$DSTSERVER}... \n\n"
dstimap = Net::IMAP.new($DSTSERVER, $DSTPORT, $DSTSSL)
dstimap.login($DSTUSER, $DSTPASS)

# Loop through all messages in the source folder.
uids = srcimap.uid_search(['ALL'])
if uids.length > 0
	$count = uids.length
	puts "found #{$count} messages to move... \n\n"

	srcimap.uid_fetch(uids, ['ENVELOPE']).each do |data|
		mid = data.attr['ENVELOPE'].message_id

		# Download the full message body from the source folder.
		puts "reading message... #{mid}"
		msg = srcimap.uid_fetch(data.attr['UID'], ['RFC822', 'FLAGS', 'INTERNALDATE']).first

		# Append the message to the destination folder, preserving flags and internal timestamp.
		puts "copying message #{mid} to destination..."
		dstimap.append($DSTFOLDER, msg.attr['RFC822'], msg.attr['FLAGS'], msg.attr['INTERNALDATE'])

		# Mark the source message as deleted.
		puts "deleting message #{mid}..."
		srcimap.uid_store(data.attr['UID'], '+FLAGS', [:Deleted])
	end

	# Expunge the deleted messages from the source folder.
	srcimap.expunge
end

# Disconnect from both servers.
srcimap.logout
srcimap.disconnect
dstimap.logout
dstimap.disconnect

total_time = Time.now - begin_
puts "Done. RunTime: #{total_time} sec. \n\n"

CentOS 5 Management Pack for OpsMgr SCX

May 13th, 2008 by Daniele Muscetta

As I mentioned here, I have been testing the SCX beta.

Not having one of the "supported" platforms pushed me into playing with the provided Management Packs: I used the MP for Red Hat Enterprise Linux 5 as a base and replaced a couple of strings in the discoveries in order to get a working CentOS 5 Management Pack.


I still have not looked into the "hardware" monitors and health model / service model, so those are not currently monitored. But it is a start.

A lot of people have asked me for more information and would like to get the file – in the blog's comments, on the newsgroup, and via mail. I am sorry, but I cannot provide you with the file, because it has not been thoroughly tested and might render your systems unstable, and also because there might be licensing and copyright issues that I have not checked within Microsoft.

Also keep in mind that using CentOS as a monitored platform is NOT a SUPPORTED scenario for SCX. I only used it because I did not have a SuSE or RedHat machine handy that day, and because I wanted to understand how the Management Packs using WS-Man worked.

This said, should you wish to try to do the same "MP Hacking" I did,  I pretty much explained all you need to know in my previous post and its comments, so that should not be that difficult.

Actually, I still think that the best way to figure out how things are done is by looking at the actual implementation, so I encourage you to look at the management packs and figure out how those work. There are a few mature tools out there that will help you author/edit Management Packs if you don't want to edit the XML directly: the Authoring Console, and Silect MP Studio Lite, for example. If you want to delve in the XML details, instead, then I suggest you read the Authoring Guide and peek at Steve Wilson's site.
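If you want to start from the existing MP XML the way I did, the OpsMgr Command Shell can export a copy for you to edit. What follows is only a hedged sketch from memory, not an excerpt of what I actually ran: the Where-Object filter on the display name and the output path are assumptions, so double-check the cmdlet parameters with Get-Help on your own Command Shell before relying on it.

# Sketch only: export the RHEL 5 MP to XML so it can be used as a base for editing.
# The display-name filter and the output path below are placeholders/assumptions.
$mp = Get-ManagementPack | Where-Object { $_.DisplayName -like "*Red Hat Enterprise Linux 5*" }
Export-ManagementPack -ManagementPack $mp -Path "C:\Temp"
# Edit the exported XML (identity, display strings and the discovery filter values),
# then re-import the modified pack from the Operations Console (or with the
# Install-ManagementPack cmdlet, if your Command Shell version has it).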

The information in this weblog is provided "AS IS" with no warranties, and confers no rights. This weblog does not represent the thoughts, intentions, plans or strategies of my employer. It is solely my own personal opinion. All code samples are provided "AS IS" without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.

Testing System Center Cross Platform Extensions

May 4th, 2008 by Daniele Muscetta

I am testing the beta bits of the cross-platform extensions that were released on Microsoft Connect.

This post describes my limited testing so far – I hope it can help everyone testing the beta with some things that might not currently be incredibly clear – unless you attended the MMS class, at least :-)

I started out with the White Paper that has been posted on the web, which describes the architecture pretty well, but from a higher level (with diagrams and the like). Then I downloaded the beta bits, which contain another document about setting the thing up. It is pretty well done, to be honest (especially if you consider that it is beta documentation for a beta product!), but it does not really go very deep into troubleshooting yet. I will try to cover some of that here.

I installed the agent manually – it’s just an RPM package, not much that can go wrong with that. There is a reason why I did not use the push discovery and deployment of the agent, which you will figure out reading on. Once installed, I tried to figure out how things looked on the Linux machine. It is all pretty understandable, after all, if you look around on the machine (documented or not, Linux and open source stuff is easy to figure out by reading configuration files and the like, and by searching on the web).

Basically the “agent” is not properly an "agent" the way the Windows agent is, since it does not really "send" stuff to the Management Server on its own: it consists of a couple of services/daemons, based on existing open source projects, but configured in their own folder, with their own names, and using different ports than a standard install of those, so as not to conflict with any existing installations on those machines.

The Management Server uses these services remotely (similar to doing agentless monitoring towards a Windows box). The two services are the WS-Management daemon (wsmand) and the CIM daemon (cimd):

 scx-services commands

It is easy to figure out how they are laid out. Even where undocumented, you can look at the processes

SCX processes

and you can figure out WHERE they live (/opt/microsoft/scx/bin/….) and where their configuration files are located (/etc/opt/microsoft/scx/conf …).

SCX Configuration

The files are self-explanatory, and the documentation of the open source projects they are based on can be found on the Internet:

for wsmand

for cimd


I still have to delve into them as properly as I would like to, but I already figured out a bunch of interesting things by quickly looking at them.

Agent communication: someone must have decided to “recycle” the 1270 port number that was used in MOM 2005 :-) Basically openwsman listens as an SSL listener (with basic auth – connected via a PAM module to the “regular” unix /etc/passwd users, so you can authenticate as those without having to define specific users for the service). So all that happens is that the Management Server asks for things / executes WS-Man queries and commands on this channel. The Management Server connects every time to the agent on port 1270 using SSL, authenticates as “root” (or as the specified "Action Account") and does its stuff, or asks the agent to do it. So the communication happens from the Management Server to the agent… not the other way around like it does with Windows "agents". That’s why it feels to me more like an “agentless” thing, at least as far as the “direction” of traffic and who does the actual querying are concerned.

For the rest, the provided Management Packs have “normal” discoveries and “normal” monitors. Pretty much like the Windows Management Packs often discover things by querying WMI, here they use WS-Man to run CIM queries against the Unix boxes.

The Service Model is totally cool to actually *SEE* in action, don’t you think so ?

Service Model


A few more debugging/troubleshooting notes:

I searched a bit and found the documentation and forum to be useful to figure some things out. For example I banged my head a few times before managing to actually TEST a query from windows to linux using WINRM. This document helped a lot.

Of course you have to solve some other things such as DNS resolution AND trusting the self-issued certificates that the agent uses, first. Once you have done that, you can run test queries from the Windows box towards the Unix ones by using WinRM.
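As a side note on the certificate part: if you just want to look at (or export) the self-issued certificate the agent presents on port 1270, a PowerShell sketch like the one below can retrieve it. This is not taken from the SCX documentation – the host name and output path are placeholders, and it needs PowerShell 2.0 or later for the script-block validation callback:

# Sketch: retrieve the SCX agent's self-issued certificate from port 1270 for inspection/export.
$tcp = New-Object System.Net.Sockets.TcpClient("centos", 1270)
# Accept whatever certificate is presented - we only want to read it, not validate it.
$ssl = New-Object System.Net.Security.SslStream($tcp.GetStream(), $false, { $true })
$ssl.AuthenticateAsClient("centos")
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($ssl.RemoteCertificate)
$cert | Format-List Subject, Issuer, NotBefore, NotAfter, Thumbprint
# Export it to a .cer file so it can be imported into the trusted store on the Management Server.
[System.IO.File]::WriteAllBytes("C:\Temp\scx-agent.cer", $cert.Export("Cert"))
$ssl.Close()
$tcp.Close()

Once the certificate is trusted (or you decide to skip the checks in a test environment), the WinRM test queries work.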

For example, this is how I tested what the discovery for a Linux RedHat Computer type should be returning – the resource URI in the command below points at the SCX_OperatingSystem CIM class in the root/scx namespace (I read what the discovery queries by opening the MP in the authoring console, as one would usually do for any MP):

winrm enumerate http://schemas.microsoft.com/wbem/wscim/1/cim-schema/2/SCX_OperatingSystem?__cimnamespace=root/scx -username:root -password:password -r:https://centos:1270/wsman -auth:basic
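If you prefer to drive the same test from PowerShell rather than winrm.exe, the WSMan.Automation COM object can run the enumeration too. This is just a hedged sketch of that approach – the host name, credentials and resource URI are the same assumptions as in the command above, and the skip-CA/CN flags are only there in case you have not yet made the agent's self-issued certificate trusted:

# Sketch: enumerate SCX_OperatingSystem on the agent via WS-Man from PowerShell.
$wsman   = New-Object -ComObject WSMan.Automation
$options = $wsman.CreateConnectionOptions()
$options.UserName = "root"
$options.Password = "password"
# Basic authentication with username/password, skipping certificate checks (lab use only).
$flags = $wsman.SessionFlagUseBasic() -bor $wsman.SessionFlagCredUsernamePassword() `
       -bor $wsman.SessionFlagSkipCACheck() -bor $wsman.SessionFlagSkipCNCheck()
$session = $wsman.CreateSession("https://centos:1270/wsman", $flags, $options)
$uri = "http://schemas.microsoft.com/wbem/wscim/1/cim-schema/2/SCX_OperatingSystem?__cimnamespace=root/scx"
$result = $session.Enumerate($uri)
while (-not $result.AtEndOfStream) {
    $result.ReadItem()    # each item is an XML fragment describing one instance
}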

If you need to test the query directly *ON* the Linux box (querying the CIMD instead of the WSMAND), the WBEMEXEC utility is packaged with the agent (under /opt/microsoft/scx/bin/tools ). It is not as easy as some Windows administrators (who have used WBEMTEST or WMI Tools in the past) would hope, but it is not that bad either. It is not really interactive, though: even just to run a few queries against the CIM daemon locally, you need to create an XML file that looks like the following (basically you build the RAW request the way the CIMD accepts it):



<?xml version="1.0" ?>
<CIM CIMVERSION="2.0" DTDVERSION="2.0">
  <MESSAGE ID="1" PROTOCOLVERSION="1.0">
    <SIMPLEREQ>
      <IMETHODCALL NAME="EnumerateInstanceNames">
        <LOCALNAMESPACEPATH>
          <NAMESPACE NAME="root"/>
          <NAMESPACE NAME="scx"/>
        </LOCALNAMESPACEPATH>
        <IPARAMVALUE NAME="ClassName">
          <CLASSNAME NAME="SCX_OperatingSystem"/>
        </IPARAMVALUE>
      </IMETHODCALL>
    </SIMPLEREQ>
  </MESSAGE>
</CIM>

Once you have made such a file, you can execute the query in the file with the tool like the following:

./wbemexec -d2 query.xml


As you can see from here, CIMD uses HTTP already. This differs from Windows' WMI that uses RPC/DCOM. In a way, this is much simpler to troubleshoot, and more firewall-friendly.


I have not really found an activity or debug log for any of those components, yet… but in the end they are not doing anything ON THEIR OWN, unless asked by the MS…. So the “healthservice” logic is all on the MS anyway. Errors about failed discoveries, permissions of the Action Account user, and anything else will be logged by the HealthService on the Windows machine (the Management Server) that is actually performing monitoring towards the Unix box.

It really is *just* getting the WMI and WinRM-equivalent layer on linux/Unix up and running– after that, everything is done from windows anyway!

Once this common management infrastructure is in place, third parties will be able to concentrate on writing *just* MPs, without having to worry about the TRANSPORT of information anymore.


As you have probably noticed from the screenshots and commandlines, I don’t have a “real” Redhat Enterprise Linux or “supported” linux distribution… Therefore I started my testing using CentOS 5 (which is very similar to RHEL 5) – the agent installed fine as you can see, but I was not getting anything really “discovered” – the MP had only found a “linux computer” but was not finding any “RedHat” or “SuSe” or any other "Operating System" instances… and if you are somewhat familiar with the way Operations Manager targeting works, you would understand that monitors are targeted at object classes. If I don't have any instance of those objects being discovered, NO MONITORING actually happens, even if the infrastructure is in place and the pieces are talking to each other:

 CentOS not discovered

Therefore my machine was not being monitored.

In the end, I actually even got it to work, but I had to create a new Management Pack (exporting and modifying the RHEL5 one as a base) that would actually search for different Property values and discover CentOS instead as if it were RedHat:

CentOS Discovered 

After importing my hacked Management Pack the machine started to be monitored. Here you can see Health Explorer in all of its glory:


Of course this is a hack I made just to have a test setup somewhat working and to familiarize myself with the SCX components. It is not guaranteed that my Management Pack actually works on CentOS the way it is supposed to, or that there aren't other – more subtle – differences between RedHat and CentOS that will make it fail. I only modified a couple of Discoveries to let it discover the "Operating System" instance… everything else should follow, but not necessarily. One difference you can see already in the screenshot above is that I am not yet seeing the hardware being monitored, so my hack is only partially working, and it is definitely something that won't be supported, so I cannot provide it here. Also, this is a beta, so I think that the Management Packs will be re-released with the following beta versions, and this change is something that would need to be re-done all over again. Also, the unsupported distribution is the reason why I installed the agent manually in the first place, as the "Discovery Wizard" would not really "agree" to let me install the agent remotely on an unsupported "platform"!

But I could not wait to see this working while waiting two business days (we are on a weekend!) for confirmation that I am allowed to actually download a 30-day-unsupported-Trial of the "real" RedHat Enterprise Linux, so I cheated :-)




The information in this weblog is provided "AS IS" with no warranties, and confers no rights. This weblog does not represent the thoughts, intentions, plans or strategies of my employer. It is solely my own personal opinion. All code samples are provided "AS IS" without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.

A Rant about Openness

May 2nd, 2008 by Daniele Muscetta

It is interesting to see that the bunch of open source projects written on and for the Microsoft platform grows and grows, and also nice to see that a lot of Microsoft employees are very active and aware of the open source ecosystem, rather than being stuck with only what the company makes. Phil Haack, in a post about an interview with Brad Wilson, wisely writes:

"[…] What I particularly liked about this post was the insight Brad provides on the diverse views of open source outside and inside of Microsoft as well as his own personal experience contributing to many OSS projects. It's hard for some to believe, but there are developers internal to Microsoft who like and contribute to various open source projects. […]"

In fact, being made by Microsoft people or not, the list of open source software on CodePlex keeps growing too. Mentioning CodePlex and interviews, another interesting one is that of Sara Ford, Program Manager for CodePlex posted on Microspotting. But Microspotting is awesome in general. My favorite quote by her:

"[…] Hey. My name is Ariel and I'm the person you thought would never work at MSFT […]".

In fact, just as I do, she is running that blog on WordPress, posting her photos on Flickr, using an RSS feed on Feedburner and in general using a bunch of things that are out there that might be seen as "competing" with what Microsoft makes. In fact, this attitude towards other products and vendors on the market is what I am mainly interested in. Should we only use flagship products? Sure, when they help us, but not necessarily. Who cares? People's blogs are not, as someone would like them to be, a coordinated marketing effort. This is about real people, real geeks, who just want to share and communicate personal ideas and thoughts. I had a blog before being at Microsoft, after all. Obviously I had exposure to competing products. My server was running LAMP on Novell Netware in 2002 – after which I moved it to Linux. It is not a big deal. And if I try to put things in perspective, in fact, this is turning out to be an advantage.

I am saying this as the latest news about interoperability comes from MMS (Microsoft Management Summit): the announcement that System Center Operations Manager will monitor Linux natively. I find this to be extremely exciting, and a step in the right direction… to say it all, I am LOVING this!!! But at the same time I see some other colleagues in technical support who are worried and scared by this – "if we do monitor Linux and Unix, we are supposed to have at least some knowledge of those systems", they say. Right. We probably do. At the moment there are probably only a limited number of people who actually can do that, at least in my division. But this is because in the past they must have sacrificed their own curiosity to become "experts" in some very narrow and "specialized" thing. Here we go. On the contrary, I kept using Linux – even when other "old school" employees would call me names. All of a sudden, someone else realizes my advantage… but a lot of geeks already understood the power of exploration, and won't stop defining people by easy labels. Another cool quote I read the other day is what Jimmy Schementi has written in his Flickr profile:

"[…] I try to do everything, and sometimes I get lucky and get good at something […]".

Reading his blog, it looks like he also gave up on trying to write a Twitter plugin for MSN/Live Messenger (or maybe he never tried – I, at least, wanted to do that) and wrote it for Pidgin instead. Why did he do that? I don't know; I suppose because it was quicker/easier – and there were APIs and code samples to start from.

The bottom line, for me, is that geeks are interested in figuring out cool things (no matter what language or technology they use) and eventually communicating them. They tend to be pioneers of technologies. They try out new stuff. Open Source development is a lot about agility and "trying out" new things. Another passage of Brad's interview says:

"[…] That's true–the open source projects I contribute to tend to be the "by developer, for developer" kind, although I also consume things that are less about development […] Like one tool that I've used forever is the GIMP graphics editor, which I love a lot".

That holds true when you consider that a lot of these things are not really mainstream. Tools made "by developer, for developer" are usually a sort of experimental ground. Like Twitter. Every geek is talking about Twitter these days, but you can't really say that it is mainstream. Twitter has quite a bunch of interesting aspects, though, and that's why geeks are on it. Twitter lets me keep up to date quicker (and with a personal, conversational touch) even better than RSS feeds and blogs do. Also, there are a lot of Microsofties on Twitter. And the cool thing is that you can really talk to everybody, at any level. Not everybody "gets" blogs, social networks, and microblogging. Of course you cannot expect everybody to be on top of the tech news, or to use experimental technologies. So in a way stuff like Twitter is "by geeks, for geeks" (not really just for developers – there are a lot of "media" people on Twitter). Pretty much in the same way, a lot of people I work with (in direct contact, every day) only found out about LinkedIN during this year (2008!). I joined Orkut and LinkedIN in 2004. Orkut was in private beta, back then. A lot of this stuff never becomes mainstream, some does. But it is cool to discover it when it is born. How long did it take for Social Networking to become mainstream? So long that by the time it is mainstream for others, I have seen it for so long that I am even getting tired of it.

For some reason, geeks love to be pioneers. This is well expressed in a digression by Chris Pratley:

"[…] some of them we will be putting out on for the general public (you folks!) to try so we can understand how "normal" people would use these tools. Now of course, as we bloggers and blog-readers know, we're not actually normal – you could even debate whether the blogosphere is more warped than the set of Microsoft employees, who comprise an interesting cross-section of job types, experiences, and cultures. But I digress. […]"

But I have been digressing, too, all along. As usual.

Conversation about Blogs with a customer

March 28th, 2008 by Daniele Muscetta

I usually don't like mentioning specific facts that happened to me at work. But work is part of life, so even if this is mostly a personal blog, I cannot help myself but write about certain things that make me think when they happen.

When I end up having conversations such as this, I get really sad: I thought we had finally passed the arrogant period where we had to spoon-feed customers, and I thought we were now mature enough to consider them smart people and provide cool, empowering technologies for them to use. I also thought that pretty much everybody liked Microsoft finally opening up and actually talking TO people… not only talking them INTO buying something – but having real conversations.

I get sad when I find that people still don't seem to accept that, and want the old model back instead. Kinda weird.


The conversation goes as follows (words are not exactly those – we were speaking Italian and I sort of reconstructed the conversation – you should get the sense of it anyway):



Me: "The SDK service allows you to do quite a lot of cool stuff. Unfortunately not all of that functionality is completely or always easily exposed in the GUI. That is, for example: it is very EASY to define overrides, but it can get very tricky to find them back once set. That's why you can use this little useful tool that the developer of that SDK service has posted on his blog…"

Cust: "…but we can't just read blogs here and there!"

Me: "Well, I mean, then you may have to wait for the normal release cycle. It might be that those improvements will make it in to the product. That might happen in months, if you are lucky, or maybe never. What's wrong if he publishes that on his blog, bypassing the bureaucracy crap, and makes your life easier with it RIGHT NOW?"

Cust: "It is not official, I want it in the product!"

Me: "I see, and even understand that. But right now that feature just isn't there. But you can use this tool to have it. Don't worry: it is not made by some random guy who wants to trojan your server! It is made by the very same developer who wrote the product itself…"

Cust: "It is not supported, what if it breaks something?"

Me: "So are all resource kit tools, in general. written by some dev guy in his free five minutes, and usually unsupported. Still very useful, though. Most of them. And they usually do work, you know that much, don't you?"

Cust: "But why on a blog?"

Me: "What's wrong with this? People are just trying to make customer's life easier by being transparent and open and direct in their communication, just talking RIGHT to the customers. People talking to people, bypassing the prehistoric bureaucracy structure of companies… the same happens on many other sites, just think for example… those are just tools that a support guy like me has written and wants to share because they might be useful…"

Cust: "But I can't follow/read all the blogs out there! I don't have time for it"

Me: "Why not? I have thousands of feeds in my aggregator and…"

Cust: "I don't have time and I don't want to read them, because I pay for support, so I don't expect this stuff to be in blogs"

Me: "Well, I see, since you pay for support, you are paying ME – in fact I am working with you on this product precisely as part of that paid support. That's why I am here to tell you that this tool exists, in case you had not heard of it, so you actually know about it without having to read that yourself on any blog… does that sound like a deal? Where's the issue?"

Cust: "Sgrunt. I want something official, I don't like this blog stuff"



I thought this was particularly interesting, not because I want to make fun of this person. I do respect him and I think he just has a different point of view. But in my opinion this conversation shows (and made me think about) an aspect of that "generation gap" inside Microsoft that Hugh talks about here:

"[…]4.30 Hugh talks about a conversation he had with a few people inside Microsoft- how there’s a generation gap growing within the company, between the Old Guard, and the new generation of Microsofties, who see their company in much more open, organic terms.[…]"

Basically this tells me that the generation gap is not happening only INSIDE Microsoft: it affects our customers too. Which makes it even more difficult to talk to some of them, as we change. Traditions are hard to change.

Ca(p)tching Cats and Dogs

March 9th, 2008 by Daniele Muscetta

I read on Jeff Atwood's blog about most strong CAPTCHAs having been defeated. Also, on top of visitors getting annoyed by it, the Captcha plugin I am using has gone unmaintained lately. And, one way or another, I am getting comment spam again. Which is something I really hate, as you know what I would love to do to spammers…

I am seriously considering giving Asirra a try. It is an interesting project from Microsoft Research for an HIP (Human Interaction Proof) that uses info from Petfinder.com to let users tell apart pictures of dogs from those of cats. There is also a WordPress plugin, in the best and newest "we want to interoperate" fashion that we are finally getting at Microsoft (this has always been the way to go, IMHO, and BTW).

Anyway, what do you think ?

Looking at OpsMgr2007 Alert trend with Command Shell

January 25th, 2008 by Daniele Muscetta

It's Friday night, I am quite tired and I can't be asked to write a long post. But I have not written much all week, not even updated my Twitter, and now I want to finish the week with at least some goodies. So it is the turn of a couple of PowerShell commands/snippets/scripts that will count the alerts and events generated each day: this information could help you understand the trends of events and alerts over time in a Management Group. It is nothing fancy at all, but it can still be useful to someone out there. In the past (MOM 2005) I used to gather this kind of information with SQL queries against the operations database. But now, with PowerShell, everything is exposed as objects and it is much easier to get the information without really getting your hands dirty with the database :-)

#Number of Alerts per day
$array = @()
$alerttimes = Get-Alert | Select-Object TimeRaised

foreach ($datetime in $alerttimes){
	$array += $datetime.TimeRaised
}

$array | Group-Object Date

#Number of Events per day
$array = @()
$eventtimes = Get-Event | Select-Object TimeGenerated

foreach ($datetime in $eventtimes){
	$array += $datetime.TimeGenerated
}

$array | Group-Object Date

Beware that these "queries" might take a long time to execute (especially the events one) depending on the amount of data and your retention policy.
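As a side note, the same per-day counts can also be produced with a single pipeline, grouping directly on the date portion of the timestamp – same idea as above, just more compact:

# Alerts per day, in one pipeline
Get-Alert | Group-Object { $_.TimeRaised.Date } | Select-Object Count, Name

# Events per day, in one pipeline
Get-Event | Group-Object { $_.TimeGenerated.Date } | Select-Object Count, Name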

This is of course just scratching the surface of the amount of amazing things you can do with Powershell in Operations Manager 2007. For this kind of information you might want to keep an eye on the official "System Center Operations Manager Command Shell" blog:


January 14th, 2008 by Daniele Muscetta

A while ago, talking to some friends, I was mentioning how cool it was that Flickr provides APIs, so that you can always get your data out of it, if you want to. There are several downloader applications that I found on the Internet, but I have not yet chosen one that I completely like among the few that I've tried. So, inspired by Kosso's PHP script for enumerating your photos on Flickr, I thought I'd port it to PowerShell and make my own version of it. Just for the fun of it. My PowerShell script does not do everything that Kosso's one does: I don't build a web page showing descriptions and comments. I suppose this is because the original script was made with PHP, which you usually run on a web server, and outputting HTML is the standard thing you would do in PHP. I just concentrated on the "download" part, since mine is a console script. You can think of mine as a "full backup" script. Full… well, at least of all your photos, if not of all the metadata. It should be trivial to extend anyway, also considering that PowerShell's XML type accelerator really makes it extremely easy to parse the output of a REST API such as Flickr's (I would say even easier and more readable than PHP's simplexml). There is a ton of things that could be extended/improved in the script… including supporting proxy servers, accepting more parameters for things that are now hardcoded… and a million other things. Even this way, though, I think that the script can be useful to show a number of techniques in PowerShell. Or just to download your photos :-) So you can download the script from here: Get-FlickrPhotos.ps1
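To give an idea of the technique, here is a minimal sketch of the [xml] type accelerator against the Flickr REST API. This is NOT the actual Get-FlickrPhotos.ps1: the API key, user id, the flickr.people.getPublicPhotos method call and the image URL format are assumptions for illustration, based on Flickr's public API documentation.

# Minimal sketch: fetch one page of public photos and download them to the current folder.
$apiKey = "YOUR_API_KEY"      # placeholder
$userId = "YOUR_USER_ID"      # placeholder
$rest   = "https://api.flickr.com/services/rest/?method=flickr.people.getPublicPhotos&api_key=$apiKey&user_id=$userId&per_page=100"

$web = New-Object System.Net.WebClient
# The [xml] type accelerator turns the REST response into a navigable object tree.
$response = [xml]$web.DownloadString($rest)

foreach ($photo in $response.rsp.photos.photo) {
    # Build the static image URL from the attributes of each <photo> element.
    $photoUrl = "https://farm$($photo.farm).staticflickr.com/$($photo.server)/$($photo.id)_$($photo.secret).jpg"
    Write-Host "downloading $($photo.title)..."
    $web.DownloadFile($photoUrl, (Join-Path (Get-Location) "$($photo.id).jpg"))
}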


January 4th, 2008 by Daniele Muscetta

I just read from Jeffrey Snover about this newly born Italian PowerShell community site.

I just created an account for myself on the site… as you know I like PowerShell, so even if I usually prefer writing stuff in English, I will try to hang out there and see how I can contribute to it.

After all, I am italian… :-)

Simply Works

December 27th, 2007 by Daniele Muscetta

Simply Works

Simply Works, uploaded by Daniele Muscetta on Flickr.

I don't know about other people, but I do get a lot to think when the end of the year approaches: all that I've done, what I have not yet done, what I would like to do, and so on…

And it is a period when memories surface.

I found the two old CD-ROMs you can see in the picture. And those are memories.
missioncritical software was the company that invented a lot of stuff that became Microsoft's products: for example ADMT and Operations Manager.

The black CD contains SeNTry, the "enterprise event manager", what later became Operations Manager.
On the back of the CD, the company motto at the time: "software that works simply and simply works".
So true. I might digress on this concept, but I won't do that right now.

I have already explained in my other blog what I do for work. Well, that was a couple of years ago anyway. Several things have changed, and we are moving towards offering services that are more measurable and professional. So it happens that, in a certain job, you need to be an "expert" and "specialize" in order to be "seen" or "noticed".
You know I don't really believe in specialization. I have written it all over the place. But you need to make other people happy as well and let them believe what they want, so when you "specialize" they are happier. No, really, it might make a difference in your career :-)

In this regard, I did also mention my "meeting again" with Operations Manager.
That's where Operations manager helped me: it let me "specialize" in systems and applications management… a field where you need to know a bit of everything anyway: infrastructure, security, logging, scripting, databases, and so on… :-)
This way, everyone wins.

Don't misunderstand me, this does not mean I want to know everything. One cannot possibly know everything, and the more I learn the more I believe I know nothing at all, to be honest. I don't know everything, so please don't ask me everything – I work with mainframes :-)
While that can be a great excuse to avoid neighbours' and relatives' annoyances with their PCs, though, on the serious side I still believe that an intelligent individual cannot be locked into doing one narrow thing and knowing only that one bit just because it is common thought that you have to act that way.

If I stopped where I am supposed to stop, I would be the standard "IT Pro". I would be fine, sure, but I would get bored soon. I would not learn anything. But I don't feel I am the standard "IT Pro". In fact, funnily enough, on some other blogs out there I have been referenced as a "Dev" (find it on your own, look at their blogrolls :-)). But I am not a Dev either, then… I don't write code for work. I would love to, but I rarely actually do, other than some scripts. Anyway, I tend to escape the definition of the usual "expert" on something… mostly because I want to escape it. I don't see myself represented by those generalizations.

As Phil puts it, when asked "Are software developers – engineers or artists?":

"[…] Don’t take this as a copout, but a little of both. I see it more as craftsmanship. Engineering relies on a lot of science. Much of it is demonstrably empirical and constrained by the laws of physics. Software is less constrained by physics as it is by the limits of the mind. […]"

Craftsmanship. Not science.
And stop calling me an "engineer". I am not an engineer. I was even crap in math, in school!

Anyway, what does this all mean? In practical terms, it means that in the end, whether I want it or not, I do get considered an "expert" on MOM and OpsMgr… and that I will mostly work on those products for the next year too. But that is not bad, because, as I said, working on that product means working on many more things too. Also, I can speak to different audiences: those believing in "experts" and those going beyond schemes. It also means that I will have to continue teaching a couple of scripting classes (both VBScript and PowerShell) that nobody else seems to be willing to do (because they are all *expert* in something narrow), and that I will still be hacking together my other stuff (my Facebook apps, my WordPress theme and plugins, my server, etc.) and even continue to have strong opinions in those other fields that I find interesting and where I am not considered an *expert* 😉

Well, I suppose I've been ranting enough for today…and for this year :-)
I really want to wish everybody again a great beginning of 2008!!! What are you going to be busy with, in 2008 ?

Role Playing | Technology

November 10th, 2007 by Daniele Muscetta

Role Playing | Technology

Role Playing | Technology, uploaded by Daniele Muscetta on Flickr.

I had not been playing Role Playing Games anymore for nearly 15 years. My wife recently thought that Joshua would be big enough to try, so I am trying to introduce him to the world of RPGs. This, as you can imagine, after all of that time, took back memories, ideas, and also made me think of how much the technology changed this all.

I am not at all referring to VIDEO or ONLINE games, even those that are marketed as being RPGs: most of them are not "real" RPGs anyway, they merely borrow some rules. I am saying that technology changed the way people ORGANIZE and prepare their role playing gaming experience (=the one played with real RPGs where you have to ACT a character), and how they interact with each other, and how the "knowledge" spreads.

When I was playing RPGs a lot, in the 80's and early 90's, everything was paper-based, with no Internet and no technology in sight. For example, we photocopied a lot of stuff back then, as opposed to today, when I just downloaded and printed a character sheet. But it was not just printed material that was being photocopied: in those years I remember myself handwriting my own extended set of rules, manuals, scenarios, descriptions of places (I even kept and found back some of those!). Everything was handwritten: text, drawings, maps. A lot of work, very hard to maintain. But passion was driving me (and my friends at that time too). That has also been a big enabler in how I taught myself to read and write English: by translating handbooks that nobody had translated into Italian. But I digress.

We used to go to a couple of highly specialized shops that were able to import and resell one or two copies of some rare handbooks of a strange game that would otherwise not sell at all. Sometimes even the specialized shops did not manage to get the originals of some of those rare books. Therefore, some of the expansions were sold as photocopies.
Some other times there had been some guy somewhere who did have one copy bought in the US and who took the effort to make an UNofficial translation and TYPEWRITE it in Italian. Photocopies of this "product" were all that was circulating.

I am not talking or caring about copyright or "pirate" issues here. We were not "avoiding" the original stuff: if anybody had told us that the stuff we wanted was actually available in its original format, we would have bought it. But it just wasn't available at all, and we wanted it. This kind of material was really close to impossible to get, with high costs, and all that us busy kids wanted was books with descriptions of imaginary fantasy worlds to place our characters in, and to improvise and narrate our stories and sagas…

Also, all in paper format, what was circulating was a certain number of fanzines – also photocopies of an original, wonderful "master copy" that someone had made with a typewriter, sticking pictures onto the paper with glue. Desktop publishing was not that common nor easy yet. But the layout is not really what interested me; it was the CONTENT that was hard to spread.

At one stage, the thing improved slightly: I finally managed to convince my parents that I was allowed to get a modem, so I started using it to connect to various BBS. A couple of those BBS of the time were related to RPGs or had a related discussion area. I was interested in technology and in knowing how it was doing its magic, but most of all I was also pretty excited at the possibilities I saw for the technology as an enabler in connecting people. Just like I am now.
I met some good friends on BBSs at that time. I'm still in contact with some of them, and I've lost some other ones, as happens in life anyway. But the possibility was showing quite clearly: those BBSs were mostly text-based, with high connection costs (in Italy, where you pay per minute for every call, even local ones)… even in those circumstances they were managing to aggregate some people and were used as vehicles to spread the knowledge.
In Italy, though, they were mostly local. International calls were prohibitively expensive. Of course we did hear of what happened to similar BBSs in the US.

In fact, after pencil and paper, through a typewriter, the revolution started there: being able to type stuff on a computer and pass your file over to someone else made it easier for it to spread. But again, I am not talking about copyrighted material. I am mostly talking about self-produced material. I still remember I had trouble digitizing maps because I did not own a scanner… on some of the BBSs people were sharing their works, and you could find good adventures and extra stuff on them. I also got to publish a couple of those I had written somewhere, and they even made it onto a fanzine first, and then onto a real magazine.

At one stage, though, I really got distracted. I probably thought I was "big enough", or I got too interested in the "serious" computing business, or I was too busy with other stuff. Probably a combination of many factors. So I sort of abandoned playing for a long time.

Now, looking back at that world, more than a decade later, I can see how it all changed: you go to the Internet, use any search engine and find dozens if not hundreds of sites with forums, people playing online using Live Messenger, people sharing their adventures or the stories of the adventures they have played, other sites that collect all of the covers and information about all the booklets and manuals that ever existed for any possible version of any game. Even the vendors are giving out stuff to play for free.

PCs and the Internet DID change the world, if anyone was still doubting. And yes, Role Playing Games and computing ARE related interests.

The world changed, yet it stayed the same: you still play those games with people, with the help of your imagination. It's the resources that are now at your fingertips.

Live Photo Gallery and Flickr

October 25th, 2007 by Daniele Muscetta

I actually read this (Live Photo Gallery allowing you to post to Flickr) a couple of days ago in an internal mail, and – even though I love Flickr – I have been extremely quiet and cautious and I did not blog about it. In fact I felt like waiting before blogging this GREAT news, because I thought that it was internal-only, confidential information, and I was worried that someone would tell me off :-(

In the end it turns out that I did not have to wait or be worried, since the cat was already out of the bag!!!

[As a side note, it happens a lot of times that stuff gets public much earlier than when I actually read about it internally. In those internal communications it is very often still considered "confidential" when the whole world is speaking about it… I don't get this whole "confidentiality" thing in these days of porous membranes…]

Security Fixes ISO images

October 25th, 2007 by Daniele Muscetta

I have just learned from Robert Hensing that Microsoft provides ISO images of DVDs containing the security fixes, for those who can't do an online update due to bandwidth or other constraints. It has probably been there for ages, only I had missed it. And if I have missed it, I am quite sure that a lot of other people have missed it too. So, it does not hurt to "echo" it :-)

Test from WordPress 2.3

September 26th, 2007 by Daniele Muscetta

The blog works, and all the plugins work too. I will *only* have to re-write a whole bunch of SQL queries for my .NET frontend, which is now broken. I'll do that at some stage; right now I can't be asked.

Ubuntu on Virtual PC 2007

September 26th, 2007 by Daniele Muscetta

Ubuntu on Virtual PC 2007

Ubuntu on Virtual PC 2007, uploaded by Daniele Muscetta on Flickr.

This was a VMWare "virtual appliance" with Ubuntu that I was using for testing. As I mostly use Virtual PC or Virtual Server, I found it annoying having to switch to VMWare Player to use that specific machine, and I could not be asked to install a new one. So I converted the .VMDK to .VHD format (the opposite direction to what is described in this article).

After that, I had to change GRUB's configuration (/boot/grub/menu.lst) to inform it that the SCSI disk (/dev/sda1) had all of a sudden become an IDE one (/dev/hda1), and then I also had to reconfigure X.

After that it runs like a charm!!!

Windows Live Install on 2003 Server ?

September 13th, 2007 by Daniele Muscetta

Windows Live Install on 2003 Server ?

I used to have Windows Live Writer and Windows Live Messenger on my Windows 2003 Server box. Now, this new fantastic integrated setup says it won't install on this operating system. Ridiculous. You read the release notes, and in fact they only mention Windows XP and Vista.

I see.

Well, I happen to use a Windows 2003 Server at home – the same machine for day to day use (like writing this post or checking private email) and doing some study/testing. I don't have loads of machines. I don't actually have money for a new machine (even if I would really need a new one to test stuff).
I try to do more with less.

Well, if this does not install, what am I supposed to do ?
I want to chat with people, which means I'll keep using Pidgin on this machine. That way I also have my GTalk, ICQ and Yahoo buddies all in one place. And it eats up much less memory than the "real" Live Messenger. And without advertisements. How nice.

I am sorry when my employer does this kind of stupid thing. This is not interoperability. It does not even work on OUR operating systems!

As for Windows Live Writer, read Phil's post. It seems like FrontPage, all over again.
For writing this post I've used Flickr.
Since I happen to post quite a bunch of photos or images on my blog, I find it ideal. The ONLY thing Flickr is missing, when used as a blogging tool, is the ability to post tags/categories too. Otherwise it would be perfect.

It's nice to see things called by their real name

September 3rd, 2007 by Daniele Muscetta

Facebook Terms of Service state that it is forbidden to "[…] use automated scripts to collect information from or otherwise interact with the Service or the Site […]"

For this reason, I had to pull down the code of the small application I had previously released, which was "logging" into the mobile web application "pretending" to be a mobile browser and changing your status. Big deal!!!

I am quite sure there are a lot of people writing "official" applications (that is, using the "platform API" and so on) that are collecting A LOT of information about the users who install their applications. They are being sent the info about the visitors by Facebook, they are storing it, they might do whatever they please with it (study it, sell it to spammers, to marketers, to making-money-assholes) and nobody will ever notice, because it is on their servers and nobody can check that.

But a script that changes your status remotely – since this is not functionality they CHOSE to expose in their API – now THAT is a big issue. Doh!
It's just plain ridiculous, but that's it.

Sure, the terms of service for app developers say a bit more in this regard:

4) Except as provided in Section 2.A.6 below, you may not continue to use, and must immediately remove from any Facebook Platform Application and any Data Repository in your possession or under your control, any Facebook Properties not explicitly identified as being storable indefinitely in the Facebook Platform Documentation within 24 hours after the time at which you obtained the data, or such other time as Facebook may specify to you from time to time;

5) You may store and use indefinitely any Facebook Properties that are explicitly identified as being storable indefinitely in the Facebook Platform Documentation; provided, however, that except as provided in Section 2.A.6 below, you may not continue to use, and must immediately remove from any Facebook Platform Application and any Data Repository in your possession or under your control, any such Facebook Properties: (a) if Facebook ceases to explicitly identify the same as being storable indefinitely in the Facebook Platform Documentation; (b) upon notice from Facebook (including if we notify you that a particular Facebook User has requested that their information be made inaccessible to that Facebook Platform Application); or (c) upon any termination of this Agreement or of your use of or participation in Facebook Platform;
You will not directly or indirectly sell, export, re-export, transfer, divert, or otherwise dispose of any Facebook Properties to any country (or national thereof) without obtaining any required prior authorizations from the appropriate government authorities;

Are we sure everybody is playing by these rules, when every Facebook "application" really runs on the developer's server? How do you know that they are really storing only what you want them to store, and deleting what you want them to delete? Everybody knows how difficult it is to really "delete" digital content once it has come into existence… who knows how many copies of this database/social graph are floating around?

Of course that is not an issue because people don't talk about it enough. But a script that changes your status – now, THAT is a very terrible thing.

I just don't get this "political correctness". It must be me.

Oh, no… look! It's not only me!
I had read this post of Dare's, but I probably had overlooked the last bit of it… because he did point out the Hypocrisy going on:

Or (5) the information returned by FQL about a user contains no contact information (no email address, no IM screen names, no telephone numbers, no street address) so it is pretty useless as a way to utilize one’s friends list with applications besides Facebook since there is no way to cross-reference your friends using any personally identifiable association that would exist in another service.

When it comes to contact lists (i.e. the social graph), Facebook is a roach motel. Lots of information about user relationships goes in but there’s no way for users or applications to get it out easily. Whenever an application like FacebookSync comes along which helps users do this, it is quickly shut down for violating their Terms of Use. Hypocrisy? Indeed.

He then insists in a more recent post in calling things by their name:

I will point out that 9 times out of 10 when you hear geeks talking about social network portability or similar buzzwords they are really talking about sending people spam because someone they know joined some social networking site. I also wonder how many people realize that these fly-by-night social networking sites that they happily hand over their log-in credentials to so they can spam their friends also share the list of email addresses thus obtained with services that resell to spammers?
how do you prevent badly behaved applications like Quechup from taking control away from your users? At the end of the day your users might end up thinking you sold their email addresses to spammers when in truth it was the insecure practices of the people who they’d shared their email addresses with that got them in that mess. This is one of the few reasons I can understand why Facebook takes such a hypocritical approach. :)

Thanks, Dare, for mentioning Hypocrisy. Thanks for calling things by their name. I do understand their approach, I just don't agree with it.

I did pull my small application off the Internet because I have a family to maintain and I don't want legal trouble with Facebook. Sorry to all those who found it handy. No, I cannot even give it to you via email. It's gone. I am sorry. For the freedom of speech, especially, I am sorry.

I will change my status more often on Twitter.

per incollare carte, stoffe, fotografie, etc…

August 31st, 2007 by Daniele Muscetta

per incollare carte, stoffe, fotografie, etc...

Coccoina, a piece of Italian history. (The photo title, in Italian, reads: "for gluing paper, fabric, photographs, etc…")

"Italy, for example, is a puzzle […]. Family businesses, therefore, form the backbone of the Italian economy. There are businesses which grow rich by doing small things very well. […] "Better not bigger" is their preferred route to to wealth because bigger inevitably means the eventual sharing of power with people you cannot know well enough to trust." – quote: Charles Handy – "Beyond Certainty"

Open Source Projects and Microsoft

August 24th, 2007 by Daniele Muscetta

This CNet article about CodePlex has some VERY interesting points:

[…] Bayarsaikhan has posted the top 25 most active open-source projects on Microsoft's Codeplex site. Looking at the list, it looks like Microsoft developers spend their time doing much the same as the rest of the Java/other world: play games and make the Web world pretty with AJAX. You can see the top project interests below in the Codeplex tag cloud.

Codeplex is interesting to me for several reasons, but primarily because it demonstrates something that I've argued for many years now: open source on the Windows platform is a huge opportunity for Microsoft. It is something for the company to embrace, not despise.

And it does several things well (better than Sourceforge, in my opinion) […]


August 23rd, 2007 by Daniele Muscetta


Tafiti, uploaded by Daniele Muscetta on Flickr.

Try it out.