Archive for the 'Networking' Category

RSS Feed for the 'Networking' Category

Three quarters of 2015, my IT career and various ramblings

Monday, October 5th, 2015

September is over. The first three quarters of 2015 are over.
This has been a very important year so far – difficult, but revealing. Everything has been about change, healing and renewal.

We moved back to Europe first, and more recently you might also have read my other post about leaving Microsoft.

This was a hard choice – it took many months to reach the conclusion this is what I needed to do.

Most people have gone through strong programming: they think you have to be 'successful' at something. Success is externally defined, anyway (as opposed to satisfaction, which we define ourselves), and therefore you are supposed to study a certain field in college, then use it at work to build your career in that same field… and keep doing the same thing.

I was never like that – I didn't go to college, I didn't study as an 'engineer'. I just saw there was a market opportunity to find a job when I started, studied on the job, eventually excelled at it. But it never was *the* road. It just was one road; it has served me well so far, but it was just one thing I tried, and it worked out.
How did it start? As a pre-teen I had been interested in computers; then I left that for a while, did a 'normal' high school (in Italy at the time, this was really non-technological), then tried to study sociology for a little bit – I really enjoyed the Cultural Anthropology lessons there, and we were smoking good weed with some folks outside the university – but I could not bring myself to spend the following 5 or 10 years of my life just studying and 'hanging around'. I wanted money and independence to move out of my parents' house.

So, without much fanfare, I revived my IT knowledge: upgraded my skill from the 'hobbyist' world of the Commodore 64 and Amiga scene (I had been passionate about modems and the BBS world then), looked at the PC world of the time, rode the 'Internet wave' and applied for a simple job at an IT company.

A lot of my friends were either not even searching for a job, with the excuse that there weren't any, or spending time at university – in a time of change, when all the university-level jobs were taken anyway, so that would have meant waiting even longer after they finished studying… I am not even sure they realized this until much later.
But I just applied, played my cards, and got my job.

When I went to sign it, they also reminded me that they expected hard work at the simplest and humblest level: I would have to fix PCs and printers, and help users with networking issues and similar tasks – at a customer of theirs, a big company.
I was ready to roll up my sleeves and help that IT department however I would be capable of, and I did.
It all grew from there.

And that's how my IT career started. I learned all I know of IT on the job, by working my ass off, studying extra hours, watching older and more expert colleagues, and gaining experience.

I am not an engineer.
I am, at most, a mechanic.
I did learn a lot about companies and the market, languages, designs, politics, the human and technical factors in software engineering, and the IT marketplace, over the course of the past 18 years.

But when I started, I was just trying to lend an honest hand, to get paid some money in return – isn't that what work was about?

Over time IT got out of control. Like Venom in the Marvel comics, which made its appearance as a costume Spider-Man started wearing… and slowly took over, as the 'costume' was in reality some sort of alien symbiotic organism (like a pest).

You might be wondering what I mean. From the outside I was a successful Senior Program Manager of a 'hot' Microsoft product.
Someone must have mistaken my diligence and hard work for 'talent' or 'career ambition' – but it never was that.
I got pushed up, taught to never turn down 'opportunities'.

But I don't feel this is my path anymore.
That type of work takes too much mental energy out of me, and it made me neglect myself and my family. Success at the expense of my own health and my family's isn't worth it. Other people have written that too – in my case, hopefully, I stopped earlier.

So what am I doing now?

First and foremost, I am taking time for myself and my family.
I am reading (and writing)
I am cooking again
I have been catching up on sleep – and have dreams again
I am helping my father-in-law build a shed in his yard
We bought a 14-year-old Volkswagen van that we are turning into a camper
I have not stopped building guitars – in fact I am getting set up to do it 'seriously', so I am also standing up a separate site to promote that activity
I am making music and discovering new music and instruments
I am meeting new people and encountering new situations

There are a lot of folks out there who either think I am crazy (they might be right, but I am happy this way) or think this is some sort of lateral move – I am not searching for another IT job, thanks. Stop the noise on LinkedIn, please: I don't fit in your algorithms; I just made you believe I did, all these years.

Repost: Useful SetSPN tips

Wednesday, October 19th, 2011

I just saw that my former colleague (PFE) Tristan has posted an interesting note about the use of SetSPN "-A" vs SetSPN "-S". I normally don't repost other people's content, but I thought this would be useful, as there are a few SPNs used in OpsMgr and it is not always easy to get them all right… and you can find a few tricks I was not aware of by reading his post.

Check out the original post at

Does anyone have a new System Center sticker for me?

Saturday, November 27th, 2010

Does anyone have a new System Center sticker?

I got this sticker last APRIL at MMS2010, in JUST ONE COPY, and I waited till I got a NEW laptop in SEPTEMBER to actually use it…
It also took a while to stick it on properly (not to mention re-installing the PC the way I wanted…), but this week they told me that, by mistake, I had been given the wrong machine (it was all their doing, though – I did not ask for any specific one) and this one needs to be replaced!!!!

This is WORSE than any hardware FAILure, as the machine works very well and I was expecting to keep it for the next two years :-(

Can anyone be so nice to send me one of those awesome stickers again? :-)

Programmatically Check for Management Pack updates in OpsMgr 2007 R2

Saturday, November 29th, 2008

One of the cool new features of System Center Operations Manager 2007 R2 is the possibility to check and update Management Packs from the catalog on the Internet directly from the Operators Console:

Select Management Packs from Catalog

Even though the backend for this feature is not yet documented, I was extremely curious to see how it had actually been implemented. Especially since it took a while for this feature to become available in OpsMgr, I suspected it could not be as simple as one downloadable XML file, like the one the old MOM 2005 MPNotifier had used in the past.

Therefore I observed the console's traffic through the lens of my proxy, and got my answer:

ISA Server Log

So that was it: a .Net Web Service.

I tried to ask the web service itself for discovery information, but failed:


Since there is no WSDL available, but I badly wanted to interact with it, I had to figure out what kind of requests it would allow, how they should be written, what methods I could call, and what parameters I should pass. To get started on this, I thought I would just observe its network traffic. And so I did: I fired up Network Monitor and captured the traffic:

Microsoft Network Monitor 3.2

Microsoft Network Monitor is beautiful and useful for this kind of work, as it lets you easily identify which application a given stream of traffic belongs to, just like in the picture above. After I had isolated just the traffic from the Operations Console, I saved the captured packets in CAP format and opened them again in Wireshark for a different kind of analysis – "Follow TCP Stream":

Wireshark: Follow TCP Stream

This showed me the reassembled conversation, and what kind of request was actually done to the Web Service. That was the information I needed.

Ready to rock at this point, I came up with this PowerShell script (to be run in the OpsMgr Command Shell) that will:

1) connect to the web service and retrieve the complete MP list for R2 (this part is also useful on its own, as it shows how to interact with a SOAP web service from PowerShell, invoking a method of the web service by issuing a specially crafted POST request; to give due credit, for this part I first looked at this Perl code, which I then adapted and ported to PowerShell);

2) loop through the results of the "Get-ManagementPack" OpsMgr cmdlet and compare each MP found in the Management Group with those pulled from the catalog;

3) display a table of all imported MPs with both the version imported in your Management Group AND the version available on the catalog:

Script output in OpsMgr Command Shell

Remember that this is just SAMPLE code: it is not meant to be used in a production environment, and it is worth mentioning again that OpsMgr 2007 R2 is BETA software at the time of writing, so this functionality (and its implementation) might change at any time, and the script will break. Also, at present, the MP Catalog web service still returns slightly older MP versions and is not yet kept in sync with MP releases, but it will be ready, with complete and updated content, by the time R2 is released.
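For the curious, the general shape of that technique – invoking a SOAP method by hand-crafting the POST request – can be sketched in a few lines. Everything below (endpoint, namespace, method and parameter names) is a made-up placeholder, NOT the actual MP Catalog web service:

```python
import urllib.request

# Hand-crafted SOAP 1.1 call: the endpoint, namespace, method and parameter
# names here are made-up placeholders, not the real catalog service values.
def build_soap_envelope(method, namespace, params):
    """Build a SOAP 1.1 envelope invoking `method` with the given parameters."""
    body = "".join("<{0}>{1}</{0}>".format(k, v) for k, v in params.items())
    xml = (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        '<soap:Body><{0} xmlns="{1}">{2}</{0}></soap:Body>'
        '</soap:Envelope>'
    ).format(method, namespace, body)
    return xml.encode("utf-8")

envelope = build_soap_envelope(
    "GetManagementPacks",            # hypothetical method name
    "http://example.com/catalog/",   # hypothetical namespace
    {"version": "R2"},               # hypothetical parameter
)

# The call itself is then just an HTTP POST with the right headers
# (the request is only built here, not sent -- the endpoint is a placeholder):
request = urllib.request.Request(
    "http://example.com/CatalogService.asmx",
    data=envelope,
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://example.com/catalog/GetManagementPacks",
    },
)
```

The real method name and SOAPAction header are exactly what the "Follow TCP Stream" view reveals, which is why sniffing the console's traffic was enough to reproduce the call.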


The information in this weblog is provided "AS IS" with no warranties, and confers no rights. This weblog does not represent the thoughts, intentions, plans or strategies of my employer. It is solely my own personal opinion. All code samples are provided "AS IS" without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.

A Rant about Openness

Friday, May 2nd, 2008

It is interesting to see that the bunch of open source projects written on and for the Microsoft platform grows and grows, and also nice to see that a lot of Microsoft employees are very active in, and aware of, the open source ecosystem, rather than being stuck with only what the company makes. Phil Haack, in a post about an interview with Brad Wilson, wisely writes:

"[…] What I particularly liked about this post was the insight Brad provides on the diverse views of open source outside and inside of Microsoft as well as his own personal experience contributing to many OSS projects. It's hard for some to believe, but there are developers internal to Microsoft who like and contribute to various open source projects. […]"

In fact, whether made by Microsoft people or not, the list of open source software on CodePlex keeps growing too. Speaking of CodePlex and interviews, another interesting one is that of Sara Ford, Program Manager for CodePlex, posted on Microspotting. But Microspotting is awesome in general. My favorite quote of hers:

"[…] Hey. My name is Ariel and I'm the person you thought would never work at MSFT […]".

In fact, just as I do, she runs that blog on WordPress, posts her photos on Flickr, uses an RSS feed on FeedBurner, and in general uses a bunch of things out there that might be seen as "competing" with what Microsoft makes. This attitude towards other products and vendors on the market is what I am mainly interested in. Should we only use flagship products? Sure, when they help us, but not necessarily. Who cares? People's blogs are not, as some would like them to be, a coordinated marketing effort. This is about real people, real geeks, who just want to share and communicate personal ideas and thoughts. I had a blog before being at Microsoft, after all. Obviously I had exposure to competing products: my server was running LAMP on Novell NetWare in 2002 – after which I moved it to Linux. It is not a big deal. And if I try to put things in perspective, this is turning out to be an advantage.

I say this because the latest news about interoperability comes from MMS (Microsoft Management Summit): the announcement that System Center Operations Manager will monitor Linux natively. I find this extremely exciting, and a step in the right direction… to say it all, I am LOVING this!!! But at the same time I see some colleagues in technical support who are worried and scared by this: "if we monitor Linux and Unix, we are supposed to have at least some knowledge of those systems", they say. Right. We probably should. At the moment there are probably only a limited number of people who can actually do that, at least in my division. But that is because in the past they sacrificed their own curiosity to become "experts" in some very narrow and "specialized" thing. There we go. I, on the contrary, kept using Linux – even when other "old school" employees would call me names. All of a sudden, someone else realizes my advantage.
…but a lot of geeks have already understood the power of exploration, and won't stop at defining people by easy labels. Another cool quote I read the other day is what Jimmy Schementi has written in his Flickr profile:

"[…] I try to do everything, and sometimes I get lucky and get good at something […]".

Reading his blog, it looks like he also gave up on trying to write a Twitter plugin for MSN Live Messenger (or maybe he never tried – but at least I wanted to do that, instead) and wrote one for Pidgin instead. Why did he do that? I don't know; I suppose because it was quicker and easier – and there were APIs and code samples to start from.

The bottom line, for me, is that geeks are interested in figuring out cool things (no matter what language or technology they use) and eventually communicating them. They tend to be pioneers of technologies. They try out new stuff. Open source development is a lot about agility and "trying out" new things. Another passage of Brad's interview says:

"[…] That's true–the open source projects I contribute to tend to be the "by developer, for developer" kind, although I also consume things that are less about development […] Like one tool that I've used forever is the GIMP graphics editor, which I love a lot".

That holds true when you consider that a lot of these things are not really mainstream. Tools made "by developers, for developers" are usually a sort of experimental ground. Like Twitter. Every geek is talking about Twitter these days, but you can't really say it is mainstream. Twitter has quite a bunch of interesting aspects, though, and that's why geeks are on it. Twitter lets me keep up to date quicker and better (and with a personal, conversational touch) than RSS feeds and blogs do. Also, there are a lot of Microsofties on Twitter, and the cool thing is that you can really talk to everybody, at any level. Not everybody "gets" blogs, social networks and microblogging, of course – you cannot expect everybody to be on top of the tech news, or to use experimental technologies. So in a way, stuff like Twitter is "by geeks, for geeks" (not really just for developers – there are a lot of "media" people on Twitter). Pretty much in the same way, a lot of people I work with (in direct contact, every day) only found out about LinkedIn this year (2008!). I joined Orkut and LinkedIn in 2004 – Orkut was in private beta back then. A lot of this stuff never becomes mainstream; some does. But it is cool to discover it when it gets born. How long did it take for social networking to become mainstream? So long that by the time it is mainstream for others, I have seen it for so long that I am even getting tired of it.

For some reason, geeks love to be pioneers. This is well expressed in a digression by Chris Pratley:

"[…] some of them we will be putting out on for the general public (you folks!) to try so we can understand how "normal" people would use these tools. Now of course, as we bloggers and blog-readers know, we're not actually normal – you could even debate whether the blogosphere is more warped than the set of Microsoft employees, who comprise an interesting cross-section of job types, experiences, and cultures. But I digress. […]"

But I have been digressing, too, all along. As usual.

Why do developers tend to forget about people behind proxy servers ?

Monday, August 13th, 2007

I know this is a very common issue.

I keep finding way too much software that claims to interact with Web 2.0 sites or services, and connects here or there… still forgetting one basic, simple rule: letting people use a proxy.

Most programmers, for some reason, just assume that since they are directly connected to the Internet, everybody is. Which isn't always the case: most companies have proxies and will only let you out on port 80 – through their proxy.

…which in turn is one of the reasons why most applications now "talk" and tunnel whatever application protocol on top of HTTP… still, a lot of software simply "forgets" or doesn't care to provide a simple "use proxy" checkbox, which would translate into two or three extra lines of code – three lines which I personally usually include in my projects, and I am not even a *developer*!! (but that might explain why I *think* of it… I come from a security and networking background :-))

I thought of writing this post after having read this post by Saqib Ullah.

Anyway. I keep finding this over and over again, both in simple, hobbyist sample code and in complex, big, expensive enterprise software. The last time I got pissed off about a piece of code missing this feature was some days ago, when testing… The previous time was during Windows Vista beta testing (I had found a similar issue in Beta 2, and had it fixed for RC1).

Actually, I am being polite in saying it is "missing a feature". To be honest, I think missing this "feature" should be considered a bug: every piece of software using HTTP *should* include the possibility of passing through a proxy (and don't forget about AUTHENTICATED proxies), or the purpose of using HTTP in the first place is defeated!!
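To put those "two or three extra lines" in perspective, here is roughly what they look like in a language with a decent HTTP stack – a minimal Python sketch, with a made-up proxy address and credentials:

```python
import urllib.request

# The "two or three extra lines": route HTTP(S) requests through a proxy.
# Address, port and credentials are made-up placeholders -- in a real program
# they would come from that "use proxy" checkbox (or a config file).
proxy = urllib.request.ProxyHandler({
    "http":  "http://user:secret@proxy.example.com:8080",  # authenticated proxy
    "https": "http://user:secret@proxy.example.com:8080",
})
opener = urllib.request.build_opener(proxy)

# Every request made through this opener now honours the proxy, e.g.:
# opener.open("http://example.com/")   (not executed here)
```

Note that embedding `user:secret` in the proxy URL is how this particular library handles authenticated proxies; the point is only that the plumbing is tiny compared to the pain its absence causes.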

Developers!!! You have to remember people ARE behind proxies !!!!!

MOM 2005 Alerts to RSS feed

Thursday, March 22nd, 2007

I am an RSS addict, you know that. So I wanted an RSS feed to show MOM alerts. I had been thinking about it for a while, last year (or was it the year before?).
It seemed like a logical thing to me: alerts are created (and can be resolved – that is, expire), generally get sorted by the date and time they were created, and they look pretty much like a list. Also, many people like to receive mail notifications when new alerts are generated.
So, if an alert can be sent to you (push), you should also be able to get to it (pull).
It is pretty much the same deal as receiving a mail versus reading a newsgroup, or syndicating a feed.

At the time I looked around, but it seemed no one had built something like this already.
So I wrote a very simple RSS feed generator for MOM alerts.
I did it quite a while ago, just as an exercise.
Then, after a while, I figured out that the MOM 2005 Resource Kit had been updated to include such a utility!

Wow, I thought, they finally added what I had been thinking about for a while. Might it be because I mentioned it on a private mailing list? Maybe. Maybe not. Who cares. Of course, if it is included in the resource kit it must be way cooler than the one I made, I thought.
I really thought something along those lines, but never actually had the time to try it out.
I just sort of assumed it must be cooler than mine, since it was part of an official package, while I am not a developer. So I basically forgot about the one I wrote, dismissing it as crap without looking into it any further.
Until today.
Today I actually tried to use the alert-to-RSS tool included in the resource kit, because a customer asked if there was any other way to get notified, other than receiving notifications or using the console (or the console notifier).
So I looked at the resource kit's Alert-to-RSS Utility.
My experience with it:
1) it is provided in source code form – which would be fine if it were ALSO provided compiled. Instead, it is ONLY provided as source, and most admins don't have Visual Studio installed or don't know how to compile from the command line;
2) even if they wanted to compile it, it includes a bug which makes it impossible to compile – solution in this newsgroup discussion;
3) if you don't want to mess with code, since this is a Resource Kit tool (as opposed to something in the SDK), you can even get it already compiled by someone, somewhere on the net – but that choice is about trust.

Anyway, one way or another, after it is finally set up… surprise surprise!!!
It does NOT show a LIST of alerts (as I was expecting).
It shows a summary of how many alerts you have: basically it is an RSS feed made of a single item, and this single item tells you how many alerts you have. What is one supposed to do with such a SUMMARY? IMHO, it is useless the way it is. It is even worse than one of those feeds that only contain the excerpt of an article, rather than the full text.
Knowing that I have 7 critical errors and 5 warnings, without actually knowing ANYTHING about them, is pointless.
It might be useful for a manager, but not for a sysadmin, at least.

So I thought my version, even if badly coded, might be useful to someone, because it gives you a list of alerts (those that are not resolved), and each of them includes the description of the alert, the machine that generated it, and a link to the actual alert in the web console – so you can click, go there, and start troubleshooting from within your aggregator!
My code does this. Anyway, since I am a crap coder, since I wrote it in only fifteen minutes more than a year ago, and since I don't have time to fix it and make it nicer… it has several issues, and could be improved in a million ways, in particular in the following areas:

  1. it currently depends on the SDK database views – it could use the MOM Server APIs or the web service instead;
  2. it uses SQL security to connect to the DB – by default MOM does not allow this; it is recommended that the SQL instance hosting "OnePoint" only use Windows Integrated Authentication. So, to make my code work, you have to switch back to Mixed Mode and create a SQL login that has permission to read the database. This is due to the fact that I coded this in five minutes and I don't know how to use delegation – if I were able to use delegation, I would, so that the end user accessing IIS would be the one connecting to the DB. If anybody wants to teach me how to do this, I will be most grateful;
  3. it could accept parameters as URL variables, so as to filter out only events for a specific machine, or a specific resolution state, etc.;
  4. at present it uses RSS.NET to generate the feed. It could be made independent of it, but I don't really see why – I quite like that library.

The code is just an ASP.NET page and its codebehind – no need to compile anything, but of course you need to change a couple of lines to match your web console address.
Also, you need to get RSS.NET and copy its library (RSS.Net.dll) into the /bin subfolder of the website directory where you place the RSS feed generator page. I see that I wrote this against version 0.86, but any version should do, really.

Here is what it will look like:


And here's the code of the page (two files):


<%@ Page Language="C#" AutoEventWireup="true" CodeFile="Default.aspx.cs" Inherits="_Default" %>


using System;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;
using System.Web;
using Rss;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        string webconsoleaddress = ""; // must change to match your web console address

        // Initialize the feed
        RssChannel rssChannel = new RssChannel();
        rssChannel.Title = "MOM Alerts";
        rssChannel.PubDate = DateTime.Now;
        rssChannel.Link = new Uri(""); // must change to match your address
        rssChannel.LastBuildDate = DateTime.Now;
        rssChannel.Description = "Contains the latest Alerts";

        // query – you might want to change the severity
        string mySelectQuery = "SELECT ComputerName, Name, Severity, TimeRaised, RepeatCount, GUID FROM dbo.SDKAlertView WHERE Severity > 10 AND ResolutionState < 255";

        // SQL connection – must change SQL server, user name and password
        SqlConnection conn = new SqlConnection("Data Source=;Initial Catalog=OnePoint;User ID=rss;Password=rss");
        SqlDataReader rdr = null;
        try
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(mySelectQuery, conn);
            rdr = cmd.ExecuteReader();
            while (rdr.Read())
            {
                // One feed item per unresolved alert, linking back to the web console
                RssItem rssItem = new RssItem();
                rssItem.Title = rdr[1].ToString();
                string url = webconsoleaddress + rdr[5];
                rssItem.Link = new Uri(url);
                rssItem.Description = "<![CDATA[ <p><a href=\"" + rssItem.Link + "\">" + rdr[1] + " </a></p><br>" +
                    "<br>Computer: " + rdr[0] + "<br>Repeat Count: " + rdr[4] + "<br>Original Alert Time: " + rdr[3];
                rssChannel.Items.Add(rssItem);
            }
        }
        finally
        {
            if (rdr != null)
                rdr.Close();
            if (conn != null)
                conn.Close();
        }

        // Finalize the feed and write it out
        RssFeed rssFeed = new RssFeed();
        rssFeed.Channels.Add(rssChannel);
        Response.ContentType = "text/xml";
        Response.ExpiresAbsolute = DateTime.MinValue;
        rssFeed.Write(Response.OutputStream);
    }
}
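As an aside, the core idea of the page above – turning a list of alerts into an RSS 2.0 document, one item per unresolved alert – is language-independent. Here is a minimal sketch in Python using only the standard library, with hard-coded dummy alerts standing in for the SDKAlertView query and a placeholder web console URL:

```python
import xml.etree.ElementTree as ET

# Dummy alerts standing in for the rows returned by dbo.SDKAlertView.
alerts = [
    {"name": "Disk full", "computer": "SERVER01", "guid": "abc-123"},
    {"name": "Service stopped", "computer": "SERVER02", "guid": "def-456"},
]
webconsole = "http://momserver/alert.aspx?v="  # placeholder web console URL

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "MOM Alerts"
ET.SubElement(channel, "link").text = webconsole
ET.SubElement(channel, "description").text = "Contains the latest Alerts"

# One <item> per unresolved alert, linking back to the alert in the web console
for alert in alerts:
    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = alert["name"]
    ET.SubElement(item, "link").text = webconsole + alert["guid"]
    ET.SubElement(item, "description").text = "Computer: " + alert["computer"]

feed = ET.tostring(rss, encoding="unicode")
print(feed)
```

The whole trick really is just "SELECT unresolved alerts, emit one item each" – which is also why a summary-only feed, like the Resource Kit tool produces, misses the point.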

Friday, November 24th, 2006

[Edited again 25th November – Jachym gave me some suggestions and insights on the use of parameters, and I slightly changed/fixed the original code I had posted yesterday. There are still some more things that could be improved, of course, but I'll leave them for the future, next time I have time for it (who knows when that will be?)]

This is a post about my first test writing a cmdlet for PowerShell. A few days after changing my blog's title to "$daniele.rant | Out-Blog" (where Out-Blog was a fantasy cmdlet name, and the title was just meant to mimic PowerShell syntax in a funny way), I stumbled across a wonderful blog post that describes how to use the assemblies of "Windows Live Writer". Then I saw the light: I could actually implement an "Out-Blog" cmdlet. I am not sure what it could be useful for… but I thought it would be fun to experiment with. I followed the HOW TO information in this other blog post to guide me through the coding:

The result is the code that follows. What you see is pretty much Boschin's code wrapped into a cmdlet class. Nothing fancy. Just a test. I thought someone might find it interesting. It is provided "AS IS", mainly for educational purposes (MINE, only mine… I'm the one whose education is being improved, not you :-))



using System;
using System.Collections.Generic;
using System.Text;
using System.Management.Automation;
using WindowsLive.Writer.BlogClient.Clients;
using WindowsLive.Writer.BlogClient;
using WindowsLive.Writer.CoreServices;
using WindowsLive.Writer.CoreServices.Settings;
using WindowsLive.Writer.Extensibility.BlogClient;
using Microsoft.Win32;

namespace LiveWriterCmdlet
{
    [Cmdlet("out", "blog", SupportsShouldProcess = true)]
    public sealed class OutBlogCmdlet : Cmdlet
    {
        [Parameter(Position = 0, Mandatory = true, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public string Title
        {
            get { return _title; }
            set { _title = value; }
        }
        private string _title;

        [Parameter(Position = 1, Mandatory = true, ValueFromPipeline = true, ValueFromPipelineByPropertyName = true)]
        public string Text
        {
            get { return _text; }
            set { _text = value; }
        }
        private string _text;

        [Parameter(Position = 2, Mandatory = true, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public string BlogApiEndPoint
        {
            get { return _blogapiendpoint; }
            set { _blogapiendpoint = value; }
        }
        private string _blogapiendpoint;

        [Parameter(Position = 3, Mandatory = true, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public string UserName
        {
            get { return _username; }
            set { _username = value; }
        }
        private string _username;

        [Parameter(Position = 4, Mandatory = true, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public string Password
        {
            get { return _password; }
            set { _password = value; }
        }
        private string _password;

        [Parameter(Position = 6, Mandatory = false, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public string ProxyAddress
        {
            get { return _proxyaddress; }
            set { _proxyaddress = value; }
        }
        private string _proxyaddress;

        [Parameter(Position = 7, Mandatory = false, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public int ProxyPort
        {
            get { return _proxyport; }
            set { _proxyport = value; }
        }
        private int _proxyport;

        [Parameter(Position = 8, Mandatory = false, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public string ProxyUserName
        {
            get { return _proxyusername; }
            set { _proxyusername = value; }
        }
        private string _proxyusername;

        [Parameter(Position = 9, Mandatory = false, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public string ProxyPassword
        {
            get { return _proxypassword; }
            set { _proxypassword = value; }
        }
        private string _proxypassword;

        [Parameter(Position = 10, Mandatory = false, ValueFromPipeline = false, ValueFromPipelineByPropertyName = true)]
        public SwitchParameter Published
        {
            get { return _published; }
            set { _published = value; }
        }
        private bool _published;

        protected override void BeginProcessing()
        {
            // Enable the proxy only when an address was actually supplied
            if (!string.IsNullOrEmpty(ProxyAddress))
            {
                WebProxySettings.ProxyEnabled = true;
                WebProxySettings.Hostname = ProxyAddress;
                WebProxySettings.Port = ProxyPort;
                WebProxySettings.Username = ProxyUserName;
                WebProxySettings.Password = ProxyPassword;
            }
            else
            {
                WebProxySettings.ProxyEnabled = false;
            }
        }

        protected override void ProcessRecord()
        {
            if (ShouldProcess(Text))
            {
                ISettingsPersister persister = new RegistrySettingsPersister(Registry.CurrentUser, @"Software\Windows Live Writer");
                IBlogCredentials credentials = new BlogCredentials(new SettingsPersisterHelper(persister));
                IBlogCredentialsAccessor credentialsAccessor = new BlogCredentialsAccessor("dummy-value", credentials);

                credentials.Username = UserName;
                credentials.Password = Password;

                MovableTypeClient client = new MovableTypeClient(new Uri(BlogApiEndPoint), credentialsAccessor, PostFormatOptions.Unknown);

                BlogPost MyPost = new BlogPost();
                MyPost.Title = Title;
                MyPost.Contents = Text;
                client.NewPost("dummy-value", MyPost, Published);

                WriteVerbose("Posted Successfully.");
            }
        }
    }
}
Email talk on Port25

Monday, November 20th, 2006

An interesting interview with Eric Allman on Port25.
He talks about the future of email, SenderID, sendmail… openness and interoperation.
Very interesting.
With the change in licensing of SenderID, let's see how quickly this gets picked up by Wietse Venema.

Time Capsule

Thursday, October 12th, 2006

Yahoo has done it again. Yet another cool photographic site: Time Capsule.
They show every day that they really GET the community thing. Thumbs up for them.

On a side note, I honestly don't know why you need to UPLOAD photos there, and can't just LINK or REFERENCE photos you've already uploaded to Flickr (isn't that a site they bought?). So I don't see any effort at INTEGRATION here, and I don't get why. If I had a photo platform like Flickr, I would use its web API to let registered users just "PASS OVER" some pictures from one site to the other.

But, hey… regardless… it looks really COOL. It really does. Cooler than actually USEFUL (it reminds me of Intrusion Detection Systems… but I digress), but that's how a lot of these community things are. It's not useful for your business, but it is good for your heart.

Google has pissed me off this week!

Saturday, October 7th, 2006

Now, I have pretty much liked GMail and Google in general. But this time they REALLY pissed me off! I will tell you that I am not a Google-hater, even if I work for a competing company. Of course not everything Google does is wonderful, but some of their services are really cool and useful, and I have never refused to say they rocked when I felt they did.
In general, people seem to love them, and their stock value shows it (with the launch of "Code Search" this week they made a lot of people scream "how cool is this", so the stock went back from just under 400 dollars to 417!). But that's not the issue. That is cool, that works. It's OK that they make money if they make cool tools. That's fine with me.

In fact, I consider GMail one of the best interfaces for reading mail out there – I love "tagging" (oops: it's called "labelling" in their syntax) and the speed of searching through messages (even though Outlook 2007 is faster on indexed content – but you have to buy it and install it on your PC)… I especially love the way it shows THREADING… so much that I moved pretty much EVERY mailing list I read to that account:

"Ma come se fa?" (Italian – roughly, "how can you do that?")
(OK, they could do better with localized versions of "Re:" in replies… in Italian, a lot of broken MUAs translate that into "R:", which GMail does not understand and so treats the reply as the start of another thread… but that's a minor issue, and one that every threading MUA has – including "mutt" – since the real problem is the broken MUAs sending "R:" in the first place. But I digress too much…)

I also keep GMail continuously open in a browser during the day, because a lot of informative mail, and mail from friends, goes there. Which is to say: I do get a lot of their ads (that being the point of such an application, for them…). By contrast, Windows Live Mail reduced its ads to showing only one… so as not to annoy you too much.
But the ads in GMail were never *really* a problem (I don't read them anyway, I just plain IGNORE THEM).

But this week they REALLY pissed me off. They REALLY have. And here is the reason:
I have been using a script for MONTHS to back up my database (the one powering THIS blog) and send the dump "off-site" to my GMail mailbox – pretty much what a lot of other people do, as described in various articles and blog posts. I then labelled those messages with a rule, so that I could get at my backups easily in case I ever needed them.
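For reference, this kind of off-site backup fits in a few lines. Here is a minimal Python sketch, not the actual script: the database name, addresses and the local MTA below are all illustrative placeholders.

```python
import gzip
from datetime import date
from email.message import EmailMessage

def build_backup_mail(dump_bytes: bytes, sender: str, recipient: str) -> EmailMessage:
    """Wrap a gzipped database dump in a mail message ready for sending."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = f"db-backup {date.today().isoformat()}"
    msg.set_content("Nightly database backup attached.")
    msg.add_attachment(
        gzip.compress(dump_bytes),
        maintype="application", subtype="gzip",
        filename=f"backup-{date.today().isoformat()}.sql.gz",
    )
    return msg

# Typical nightly use (illustrative):
#   import smtplib, subprocess
#   dump = subprocess.run(["mysqldump", "blogdb"],
#                         capture_output=True, check=True).stdout
#   with smtplib.SMTP("localhost") as smtp:
#       smtp.send_message(build_backup_mail(dump, "backup@example.org",
#                                           "me@gmail.example"))
```

A GMail filter rule then labels anything with that subject prefix, which is all the "backup management" needed on the receiving side.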

Now, I don't know if this violates their terms of use in any way. I am not really using GMail as raw storage, like those programs that circulated at one stage which had "reverse engineered" it: those bypassed the web interface altogether, so people could use it as storage without ever seeing the ads – that, I think, was the issue with them. In my case, I am just sending MAILS to myself. One per day. I also delete the old ones every now and then, and they are not even huge (attachments of 40 to 50KB so far!!)… anyway, I know a lot of people who store documents and all sorts of stuff even in their corporate Outlook mailboxes (and then maybe index them with Windows Desktop Search or Google Desktop to find things again)… I was only doing the same with GMail. I don't see the big issue here… they might think otherwise… but from what is happening, I don't think that's the issue.

Anyway, now it's been three or four days that my backup mail gets rejected. My SMTP Server gets told:

host[] said:
550-5.7.1 Our system has detected an unusual amount of unsolicited
550-5.7.1 mail originating from your IP address. To protect our
550-5.7.1 users from spam, mail sent from your IP address has been
550-5.7.1 rejected. Please visit
550-5.7.1 to review
550 5.7.1 our Bulk Email Senders Guidelines.

Now, for fuck's sake. You know how much I hate SPAMMERS and what I would like to do to them. But I also know that it does happen, sometimes, to end up in RBLs and the like. Fine. But GIVE ME a way to tell you that I am NOT one! If you go to the link above, all you find is a form where you can report that mail which ended up in your "junk" folder actually wasn't spam. Yeah, right. In my case it does not even go into a "junk" folder! And how am I supposed to give THEM the original header that arrived on their side, if all I have is the one sent by my mailserver? They just blacklisted my mail server's IP address! For what it's worth, I even have an SPF record, I always use the same address, etc…
So I tried to fill in the form, the day after I also tried to contact their and addresses.
Still nothing.
They even tell you (in the automated reply when you contact "abuse"):
"[…] For privacy and security reasons, we may not reveal the final outcome of an abuse case to the person who reported it. […]".
How great. How am I supposed to know if they even READ my complaint ?
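Incidentally, whether an address has landed on one of the public DNS blacklists is at least something you can check yourself: you reverse the octets of the IP and look it up inside the blacklist's zone. A minimal sketch (the zone name below is just a well-known example, not necessarily whatever list GMail consults internally):

```python
import socket

def rbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the reversed-octet DNSBL query name for an IPv4 address."""
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """True if the blacklist returns an A record for the address."""
    try:
        socket.gethostbyname(rbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        # NXDOMAIN: the address is not on this list.
        return False
```

A listed address typically resolves to a 127.0.0.x answer; an unlisted one gets NXDOMAIN, which `gethostbyname` surfaces as `gaierror`.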

You anti-spam people at GMail: "I am NOT a fucking spammer!!!!!". I haven't found a better way to tell you this, you know, than writing it on my blog… this is just RIDICULOUS!

But to date my mails still get dropped. I'll probably have to send my backups somewhere else. At this point they have pissed me off so much that I am also seriously considering going back to using my own mailserver for receiving and reading my mailing lists. Then I won't get ads there.
(I hope you have some Dutch guy on board at Google, as "Google Translate" does not translate from/to Dutch yet…)

Edited on October 8th – While GMail REJECTS those mails (that is, it SAYS it is not accepting them), Hotmail simply DROPS them (that is, it does not even SAY it is not accepting them):

to=,[], delay=3, status=sent (250 <> Queued mail for delivery)

This way you THINK it is going to be delivered, but it NEVER shows up in the inbox. I don't know who is behaving worse…

A visual conversation

Friday, August 25th, 2006

"[…] But now I’ve come to realize that Flickr is so much more. It’s not just a cleverly designed web application. It’s a repository of human knowledge and creativity organized organically. It’s a visual conversation. It’s countless stories intertwined. It’s a community. It’s a virtual world. It’s a massively multiplayer online role-playing game. […]"
excerpt from:

This guy is right. Stephan, you really got it – and you described it well, too.
I am just crazy about this Flickr thing. It's the HUMAN and collaboration features that make Flickr so cool, addictive and popular. Those really make it stand out above ANY other photo-gallery software or service available.
That is the reason why I use it (and pay for it) even if it has been bought by a competitor of my company, even if I have my own server where I could independently publish my galleries at no cost, even if… [insert random reason here about why I should not be using it]. It is for those "countless stories intertwined" that I like it so much.

Windows Vista 5472

Monday, August 21st, 2006


Cool. Transparency in this RC1 build finally works on my laptop. I mean, even with a DECENT resolution.
Everything is much more stable than in the previous beta1 and beta2 builds.

And yes, the background image I am using is this photo of mine.

How programs can teach each other

Sunday, August 20th, 2006

This article shows an interesting (interesting because it is simple but effective!) approach to training SpamAssassin's Bayesian spam filter by leveraging the training data in Thunderbird's Bayesian filter. Basically, you can use one program to teach another program how to work better!
This paradigm is cool!
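The general idea can be sketched in a few lines of Python: export one filter's verdicts as separate ham/spam mailboxes, then feed those to the other filter's trainer. The junk-marker header below is purely illustrative (Thunderbird keeps its real junk state in its own status headers), and the sa-learn step is shown only as a comment:

```python
import mailbox
from email.message import Message

def looks_like_junk(msg: Message) -> bool:
    """Illustrative predicate standing in for the first program's verdict:
    here we just check an assumed marker header."""
    return msg.get("X-Assumed-Junk", "").lower() == "yes"

def split_corpus(mbox_path: str, ham_path: str, spam_path: str) -> tuple:
    """Copy messages from one mailbox into separate ham/spam mailboxes,
    ready to be fed to another filter's trainer. Returns (ham, spam) counts."""
    ham, spam = mailbox.mbox(ham_path), mailbox.mbox(spam_path)
    counts = [0, 0]
    for msg in mailbox.mbox(mbox_path):
        junk = looks_like_junk(msg)
        (spam if junk else ham).add(msg)
        counts[junk] += 1
    ham.flush()
    spam.flush()
    return counts[0], counts[1]

# The second program then learns from the first one's verdicts, e.g.:
#   sa-learn --ham  ham.mbox
#   sa-learn --spam spam.mbox
```

The nice part is that neither program needs to know about the other: the mbox files are the common language.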

Some people are doing new things

Monday, June 19th, 2006

Playing in a band in Rome? Want the best people to help you record your music? Some friends of mine have opened a recording studio: Monkey Studio.

Also, my dad has started leading tourist trips and excursions with an association of friends. If you want to visit Rome and have a great tour guide who knows what he is talking about, give them a try! The association also leads trips in the countryside, to enjoy nature.

Roma Wireless

Sunday, January 15th, 2006
Roma wireless

Seems like the Rome council is setting up public WiFi hotspots in parks!
I haven't personally checked whether they actually *work*, but nonetheless this is cool :-)
I even found official information about this:…

Java… oh Java… (aka "High vs. Low level languages rant")

Monday, January 2nd, 2006

I said here (and someone else said it too) that "Java is the new cobol".
When I said so, I mentioned that En3pY hates Java; here is another post of his, written after I forwarded him this Joel article (which I had read via Scoble, in turn).

All in all, in this case, I tend to agree with Joel on some points and slightly disagree on others.

In fact, while I do acknowledge the need for "hardcore" developers to fix and build lower-level things and maintain existing code (and to know WHAT they are doing), there are also many cases where coding in a high-level language that abstracts complexity away IS actually more efficient and cost-effective, since you don't have to reinvent the wheel every time.
So there are a lot of useful and nice programs, written by people who DO KNOW what happens under the hood (as fluent in C as in Assembler), that for simplicity and flexibility run in sandboxes and high-level languages, even interpreted ones! An example is Dave Aitel's CANVAS, written in Python. But that's just one example.

But I do agree with En3pY that I don't like Java myself, and I consider it too "heavy", in general.
The solution on my side, though, is that you don't need C or assembler to get cleaner, smaller, more efficient code – you just need better languages. An example is a situation I was involved in some time ago: a colleague (who works with a very large customer running a very large Exchange deployment) needed to do some performance testing of this Exchange system. He had done the testing from some Windows IMAP clients, but the customer also wanted to see the same performance figures measured from a Linux box accessing the same Exchange over the very same IMAP protocol.
So I wrote a nice and sweet Ruby script – and at the same time another colleague developed a similar application (in Java).
Result: 45 kilobytes of .JAR to do the same things I did in 20 lines of Ruby (20 lines – including comments!).
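The original script was Ruby, but the same kind of measurement stays just as short in a Python sketch (server name, credentials and folder below are placeholders, and the timing helper is the only part that matters):

```python
import imaplib
import time

def timed(action):
    """Run a callable and return (result, elapsed_seconds)."""
    start = time.monotonic()
    result = action()
    return result, time.monotonic() - start

def fetch_headers(host, user, password, folder="INBOX", n=50):
    """Log in over IMAP and fetch the headers of the first n messages."""
    with imaplib.IMAP4_SSL(host) as imap:
        imap.login(user, password)
        imap.select(folder, readonly=True)
        _, data = imap.search(None, "ALL")
        return [imap.fetch(num.decode(), "(BODY.PEEK[HEADER])")
                for num in data[0].split()[:n]]

# e.g.:
#   msgs, secs = timed(lambda: fetch_headers("exchange.example.com", "u", "p"))
#   print(f"{len(msgs)} headers in {secs:.2f}s")
```

Exchange speaks the same IMAP to this as it does to any Windows client, which was the whole point of the exercise.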

DIG on Windows (vs NSLOOKUP)

Saturday, December 24th, 2005

Some time ago (actually quite a while ago – but I don't always find the time to blog… you must have noticed, since I am blogging now, during the Xmas holidays… which is insane on its own, but that's another story), thanks to Peter Provost's blog I spotted NetDIG – available at !

I don't usually cross-post many links found elsewhere, but this one… I just had to.
I am a "command-line guy", when possible. I like command-line power. So I usually hang around with "Services For Unix" installed, plus a collection of other Unix-like external tools and add-on CLI commands for doing all sorts of things on my laptop…
…waiting for MONAD ( But I've got the beta running.

So this nice port of "dig" was missing from my collection… and I was stuck with nslookup when it came down to solving DNS issues from Windows. Now I have a "dig" implementation on Windows too. Awesome. In fact, I'd always wondered why Windows, to date, only ships with nslookup, which is deprecated and considered a "legacy" thing on UNIX.

SOTM30 !

Thursday, April 15th, 2004

After many sleepless nights of work in March… and a bit of waiting… I am PROUD to see the work I did with the IT Virtual Community published as the FIRST one on !

You can check it out on
There is another mirror here

In particular, I am glad to read that "[…] We received fewer submissions then usual (6), but these submission were all extremely well done, some of the best we have seen. We highly recommend you check it out. […]"! It means our work was really appreciated!

A great thanks goes to Anton Chuvakin, who sponsored the SOTM, and another thanks to everybody who participated/helped in making this possible (BESA, Brennan, En3pY, Antos, Max, etc.)!

Scan of the Month 29 – Honeynet

Saturday, November 1st, 2003

Yesterday I was very proud of myself when I saw my writeup for the 'Scan of the Month' forensic analysis actually published on !
It means that even though I would have liked to make it better and more complete, it wasn't that bad in the end!
I wish to warmly thank the members of The Honeynet Project for this great learning opportunity they set up for everybody in the security community, and I also wish to thank Brennan Bakke of GMTECH for his insight into the ext2/3 filesystem: it put me on the right track to solve this puzzle the way I did.
A huge thanks goes to my wife, for always leaving me the time to be a 'geek'. No woman could understand me better.
And of course thanks to the other guys at ITVC for encouraging me in writing this writeup.