Saturday, December 29, 2012

Everything should be everyone's business!

When working with more than 2-3 developers on a big project, you can't always be aware of everything that happens around you... Nevertheless, you should be, and even need to be!

What am I talking about?
You should be aware of everything that happens on your project - from open bugs to the build server status, the site/product performance, newly implemented features, errors in the log, and the list goes on...
Every aspect of YOUR project should interest you on a daily basis!

Why is this hard?
When working with many people, on a big project, you can't really go over all the commits, the new features, always check the performance graphs, search the logs for errors and QA the site for new bugs.

How is this done?
Sometimes all you need to do is make things a little easier for everyone. What do I mean by that? Sometimes you need to give developers the right tools to view the logs; sometimes you need to put up a new monitor in your company hallway and display the performance graphs for people to see each time they go for a coffee break; and sometimes you need to build a system that sends an automatic email to each developer that breaks something, just to get their attention...

A couple of things I am working on at my company to make things more transparent and 'in their face':
- We have a custom built internal performance profiler that we use a lot. It looks like the mvc-mini-profiler (built by Stack Overflow) and is displayed when adding a secret cookie. This wasn't good enough for me! I wanted everyone to see it all the time, so I modified it to be displayed for everyone coming from inside our office networks.

Numbers are pixelated intentionally

- Performance graphs... We heavily use graphite, and store TONS of data there... The problem is, it's sometimes hard to find exactly what you need, and it's not that comfortable. This is also unacceptable... What I did was create a really nice looking dashboard with very little information, but just enough to see what's important, and soon we're going to hang it in the hallway for everyone to see all the time!


Numbers are pixelated intentionally

- Bugs... Sometimes people run into them, but taking care of them isn't that easy... I created a button that is rendered on our site for everyone within the network. Clicking it lets you report a bug really easily - you just enter a title for the bug, and the system adds all the other relevant information (a screenshot, the version, which user, errors previously caught, and so on). This automatically opens a bug in our bug tracking system.
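To give a feel for it, here's a small hypothetical sketch (not our actual code, and all the names here are made up) of the kind of context such a button can gather automatically, so the reporter only has to type a title:

```javascript
// Build a bug report object from a one-line title plus page context.
// In the browser, 'context' would be gathered from window.location,
// navigator, the app's version stamp and a client side error log;
// a screenshot would be captured separately.
function buildBugReport(title, context) {
  return {
    title: title,                       // the only thing the user types
    url: context.url,                   // which page the bug was seen on
    userAgent: context.userAgent,       // browser details
    version: context.appVersion,        // which build of the site
    errors: context.recentErrors || []  // errors previously caught on the page
  };
}
```

The resulting object can then be posted to an endpoint that opens the bug in the bug tracking system.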


- Logs - We have an internal utility that reads the log every day and checks for the top erroneous components. I decided I needed to put this up on another monitor in the hallway, and next to each component I put the name of the person responsible for everyone to see! :) No one wants to be on this monitor, and therefore this is bound to work...


I strongly believe in this whole philosophy, and won't quit until the cleaning lady calls me in the morning to tell me that the monitor is red! :)

Sunday, December 16, 2012

I'm back...


I didn't write any posts for quite a while and I'm really disappointed about that...
Most of the things I do, or try to do, on a regular basis turn into habits that are eventually just part of my system. Every once in a while when something unusual happens that 'knocks me off the wagon' I realize it's hard to get back to doing it. Eventually I force myself to get back to it, and then everything gets back to normal...

In this case, blogging - I had abandoned it for quite a while. It started a couple of months ago when I began preparing for a month-long trip to Japan, and since I returned I haven't gotten back to it.

So... sorry for disappointing you, the reader, if you were expecting something technical, or computer related, but I just felt the need to write something here just to get the ball rolling again.

And since I'm here, and I already told you about my trip to Japan, I thought I'd at least post some photos to feast your eyes on... :)


The character on the billboard is one of the most famous singers in Japan. Young teenagers buy tickets in advance and fill up entire stadiums just to watch 'her' sing on a giant screen!


Used laptops going for really cheap...


Typical Japanese cosplay @ Takeshita Dori


Some interesting architecture... (Maybe this building was straight before all the earthquakes! )


The average grocery store filled with lines of people reading the new weekly manga


Enjoy :)
I'll be back soon with something more technical for you...

Tuesday, July 31, 2012

Sears Israel 2012 Hackathon

This was my first hackathon, and I must say it was quite awesome!...
For two days at work, everyone was crunching on their projects, trying to get as much done as humanly possible in that short period of time. I stayed pretty late both days, and even worked a little more the following weekend, as some others did too.

Some of the projects included creating apps on top of the application framework we have on our site (which is kinda like building a facebook app - You are hosted in an iframe on the site and can work against the site's api).
I created a 'polyvore'-like app - it lets you create a collage of products on a canvas for others to see (mainly intended for women's use).
Some other cool apps were games like "The price is right" for products on our site, or an item tagging compatibility game where you need to tag items identically with a partner to score points.
There were also some actual features developed for the site, like a 'GeoFeed' - the capability of presenting a user's newsfeed on a map, with relevant items in their area. Some people took the time to install or implement tools that will help us get our job done better daily, like a cool dynamic dashboard that can contain almost any information we want, or an IRC server that gives us a better communication channel for development issues than email.

Whatever the project was - I think it was fun for everyone working on them, and very thoughtful of the company to hold such an event...

Here are just some cool pics from the hackathon -

Me working on my hackathon app...


Our R&D VP working on his app...


The team in Chicago working on their app...


It's not a real hackathon, unless you have hackathon water!


...and lots of energy drinks!


...and a big hackathon timer on the hallway monitors :)

Sunday, July 8, 2012

Always wear sunscreen while maintaining a blog


Many people think that maintaining a blog is a lot of work. I think saying that is like saying reading the paper is a lot of work - you need to go out, buy it, free up some time and actually read it. But that's not the reality of it; you just grab it during breaks you have during the day, and read a little...
No commitment, no strings attached and nothing happens if you don't finish the article you started...

The same should go for a blog -
You don't need to make a fuss about it and you don't need to free any time for it.
Just start one, and whenever you have something on your mind - something new you learned, something you feel like expressing, then just jot it down.

Don't think about what others will think.
Don't think about what will happen if you never finish the post.
Just start typing...

Now, when there are so many free, easy-to-use tools to get the job done, it should be happening 100 times more than it actually is!...

Just write it for yourself. If someone else happens to learn from it, or connect to it in any way, well, then that's just an added bonus.
You'd be surprised by how much you'll learn by publishing one of your own.
Even if you only ever have one or two good posts, it's worth having them out there for others to read, but that should never be what it's about.

Don't look at how many people read your blog.
Don't think you have to be an expert to start one.
Don't believe everything you read in other blogs, and don't think that every blog is written by some expert.
Read a lot of blogs yourself, and comment when you feel like you have something to say on the matter.
You'd be surprised by the amount of comments you'll be getting (or not getting) on your posts...

People don't start reading your blog just because you posted something. They'll read it because they were looking for something that you posted about. That might happen today, maybe tomorrow and maybe only in 3 years from now...
They won't stop reading it just because you don't post anything either...

It's all about saying what you have to say, whenever you feel like it. Nothing more.
So just do it, have fun doing it, and always wear sunscreen.

:)
Gilly.

Sunday, June 24, 2012

Running UI tests on local asp.net website with development server


These past couple of days I started writing a utility program that invokes UI tests against a website of my choice. Obviously, I needed to write tests for the functionality of this utility, so I could be sure it works.
I wanted the tests to be as real as possible, so I decided to run them against a real site, like google (it could've been any other site; that doesn't really matter for the sake of the story).

The tests worked great, but then I realized that they could easily break for reasons unrelated to my work - no one promises me that google's page structure (or anything else I want to test) will always stay the same. I also don't know which tests I'll want to add in the future, or whether the site of my choice will have all the resources I need.

I decided to create a custom asp.net mvc site for the sake of my tests.
At first I configured the site on IIS locally and this was great, but then I thought another step ahead and realized I had another problem - I want someone else to be able to download the source code and run the tests immediately, without having to create a local website and go through all the proper configuration.

I realized that I could use the asp.net development server for this.
First, I needed to give the site a static port to run on in the development server :
(In the project properties page, go to the 'Web' tab, and mark 'Use Visual Studio Development Server' -> 'Specific Port')

Then, I created a class called 'DevelopmentServer' responsible for loading the asp.net development server, and shutting it down at the end :
public class DevelopmentServer : IDisposable
{
    private readonly Process _devServer;

    public static int Port = 1212;
    private string _devServerExe = @"C:\Program Files (x86)\Common Files\microsoft shared\DevServer\10.0\WebDev.WebServer40.EXE";
    private string _testSitePath = @"C:\Dev\MySitePath\";

    public DevelopmentServer()
    {
        _devServer = new Process {
            StartInfo = { 
                FileName = _devServerExe,
                Arguments = string.Format("/port:{0} /path:{1}", Port, _testSitePath)
            }
        };

        try
        {
            _devServer.Start();
        }
        catch
        {
            Console.WriteLine("Unable to start development server");
            throw;
        }
    }

    public void Dispose()
    {
        _devServer.Kill();
        _devServer.Dispose();
    }
}
The port number and the paths are hard coded for the sake of the example. It makes more sense to put them in a configuration file of your choice.
(Note: You might have to change the path to the file WebDev.WebServer40.exe which is the binary of the development server, according to the version you have. The earlier version of this file is called WebDev.WebServer20.exe)

Finally, I just created an instance of this class upon FixtureSetUp, and disposed of it on FixtureTearDown.

Thursday, June 21, 2012

Redirecting external Process output to Console.Writeline (or elsewhere)


As part of some code I was writing recently, my app needed to trigger some other command line utility at a certain point. I did this easily, using the Process class like this :
var myProcess = new Process {
    StartInfo = {
        FileName = "C:\\path\\to\\my\\cmd\\utility.exe",
        Arguments = " --some --random --args"
    }
};
myProcess.Start();

This was working great, with one exception - sometimes the process threw an error, causing the console window to close immediately, and I didn't have time to view the error.
I knew you can tell there was an error from the process's exit code (myProcess.ExitCode), but while debugging it was important to know what error was happening and actually see the output of the process.

Digging a little into the Process class, I easily found that you can redirect the process's output elsewhere. You just need to add :
// This needs to be set to false, in order to actually redirect the standard shell output
myProcess.StartInfo.UseShellExecute = false;
myProcess.StartInfo.RedirectStandardOutput = true;
// Most utilities write their errors to stderr, so redirect that stream as well
myProcess.StartInfo.RedirectStandardError = true;

// These events are triggered when output/error data is received.
// I used Console.WriteLine() - you can use whatever you want basically...
myProcess.OutputDataReceived += (sender, args) => Console.WriteLine(args.Data);
myProcess.ErrorDataReceived += (sender, args) => Console.WriteLine(args.Data);

myProcess.Start();

myProcess.BeginOutputReadLine(); // without these, the events won't ever be triggered
myProcess.BeginErrorReadLine();

That's it! Now I was getting all I needed from the running process, and it was much easier to find the problem this way. :)

Enjoy :)

Saturday, June 16, 2012

Having fun web crawling with phantomJs


A couple of weeks ago, a colleague of mine showed me this cool tool called phantomJs.
This is a headless browser, that can receive javascript to do almost anything you would want from a regular browser, just without rendering anything to the screen.

This could be really useful for tasks like running ui tests on a project you created, or crawling a set of web pages looking for something.

...So, this is exactly what I did!
There's a great site I know of that has a ton of great ebooks ready to download, but the problem is that they show you only 2 results on each page, and the search never finds anything!

Realizing that this site has a very simple url structure (e.g.: website/page/#), I just created a quick javascript file telling phantomjs to go through the first 50 pages and search for a list of keywords that interest me. If it finds something interesting, it saves the name of the book along with the page link into a text file so I can download them all later. :)

Here's the script :
var page;
var fs = require('fs');
var pageCount = 0;

scanPage(pageCount);

function scanPage(pageIndex) {
 // dispose of page before moving on
 if (typeof page !== 'undefined')
  page.release();

 // dispose of phantomjs if we're done
 if (pageIndex > 50) {
  phantom.exit();
  return;
 }

 pageIndex++;
 
 // start crawling...
 page = require('webpage').create();
 var currentPage = 'your-favorite-ebook-site-goes-here/page/' + pageIndex;
 page.open(currentPage, function(status) {
  if (status === 'success') {
   window.setTimeout(function() {
    console.log('crawling page ' + pageIndex);
    
    var booksNames = page.evaluate(function() {
     // there are 2 book titles on each page, just put these in an array
     return [ $($('h2 a')[0]).attr('title'), $($('h2 a')[1]).attr('title') ];
    });
    checkBookName(booksNames[0], currentPage);
    checkBookName(booksNames[1], currentPage);
    
    scanPage(pageIndex);
   }, 3000);
  }
  else {
   console.log('error crawling page ' + pageIndex);
   scanPage(pageIndex); // move on to the next page instead of stopping here
  }
 });
}

// checks for interesting keywords in the book title,
// and saves the link for us if necessary
function checkBookName(bookTitle, bookLink) {
 var interestingKeywords = ['C#','java','nhibernate','windsor','ioc','dependency injection',
  'inversion of control','mysql'];
 for (var i=0; i<interestingKeywords.length; i++) {
  if (bookTitle.toLowerCase().indexOf(interestingKeywords[i]) !== -1) {
   // save the book title and link
   var a = bookTitle + ' => ' + bookLink + ';';
   fs.write('books.txt', a, 'a');
   console.log(a);
   break;
  }
 }
}

And this is what the script looks like when running :
Just some notes on the script :
  • I added comments to try to make it as clear as possible. Feel free to contact me if it isn't.
  • I hid the real website name from the script for obvious reasons. This technique could be useful for a variety of things, but you should first check the legal issues involved.
  • I also added an interval of 3 seconds between each website crawl. Another precaution from putting too much load on their site.

In order to use this script, or something like it, just go to the phantomjs homepage, download it, and run this at the command line :
C:\your-phantomjs-lib\phantomjs your-script.js

Enjoy! :)

Friday, June 8, 2012

My opinion on GIT vs. SVN


I finally decided to convert to git...
Yes, this sounds like a religious statement (just like saying "I'm converting to Christianity, Judaism or Islam") because it is!

Let me give some background -
At my work, we have a main svn repository, and we all used to use subversion (with TortoiseSVN and AnkhSVN for VS compatibility). This was all great, until one day some people decided that git is so much better than svn and started convincing the others to try it out. So some of us did (or more accurately, attempted to try it out) by installing Git Extensions. In case you're not familiar, Git Extensions is a windows gui for git that is extendable, so there's a plugin called git-svn that allows you to work git-based while actually having an svn server in the background. I went along and did just the same a couple of weeks ago. I also hosted a personal project on github to get the feeling of working with a real git repository (and not just using git-svn).

I must say, that working with git, is great!
...But also working with svn was great!
If I try to look back on it, I don't really remember the same people that are fans of git now complaining about svn the way they "claim" to have been complaining about it. I think they're more in love with the "coolness" of using git, since it seems like the cooler trend nowadays.
Github has become so popular lately, and not for nothing - Their website is really easy and comfortable to use, and they have great tools for socializing and communicating on distributed projects over the internet. With that said, if they were hosting svn as well, I don't think it would be that big a difference.

All in all, git is great, but it has its share of problems. Svn has its own, totally different share of problems as well. If there's one really good reason for me to convince you to start using git, it's that I truly believe great developers should be well aware of the advantages and disadvantages each tool set gives you, and that it's always good to learn a new tool every once in a while. That's the only way to make the decision on which tools are best for you or your project.

Git and svn are completely different in how they work, and it's really interesting to dig deep into...
Here's a great post I read a while ago that explains it really well : Understanding Distributed Version Control Systems

Monday, May 28, 2012

Sears Israel - Upcoming Hackathon...


A year ago, before I started working at Sears, they had a 2-day 'dojo'. Everyone in the company grouped into teams and worked on the projects that interested them most. These projects weren't necessarily connected to what we do as a company, just projects people wanted to do for fun, and some of them are actually being used by people in the company to this day!
I'm still hearing about how much fun it was, and how people have been waiting for another event like the dojo.

The time has finally come, and in about a month we are having another event just like it. This time it's a 4-day hackathon, which is supposed to be about building applications on top of our site's platform (www.shopyourway.com), or about building internal tools that will help us in the process of creation in the future. We're already talking about how we're going to build the teams and what apps we're going to build, and all the buzz around the event is really exciting... I love it when we have a chance to express ourselves a little more freely than usual, and I think this is essential in developing the people you work with in so many ways! :)

I thought I'd just post here some articles about different hackathons in some other cool companies around the globe :
- Facebook's hackathon page
- Twitter hack week
- Feedburner's hackathon
- Dropbox hack week
- Google's 20 percent time (which is basically like an ongoing hackathon)

You see... all the cool kids on the block are having them, so now it's our turn too!... :)

A big thank you to all the people running my workplace!!

You can be sure I'll update my blog after the event, and tell you about the project I worked on...

Thursday, April 26, 2012

Solving the "Move the Box" game (Programmatically)


A couple of weeks ago some colleagues at work showed me this nice puzzle game called "Move the Box". I've seen dozens of these kinds of puzzle games in different variations, and as usual, I got hooked on it for a while.

I got stuck on a level I couldn't pass a couple of days ago, and this got my programming head thinking...
Hmmm... The board is only 6 by 7 tiles, and the levels have at most 3 moves... It shouldn't take my computer too long to brute force through all the possibilities... :)
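The brute force idea can be sketched generically like this (a simplified sketch, not my actual solution code - applyMove and isSolved stand in for the game's swap, gravity and match rules):

```javascript
// Depth-limited brute force: try every adjacent swap (trying right and
// down from each tile covers all pairs), recurse with one move fewer,
// and stop as soon as a sequence of moves solves the board.
function solve(board, maxMoves, applyMove, isSolved, moves) {
  moves = moves || [];
  if (isSolved(board)) return moves;   // found a solving sequence
  if (maxMoves === 0) return null;     // out of moves on this branch
  for (var r = 0; r < board.length; r++) {
    for (var c = 0; c < board[r].length; c++) {
      var swaps = [];
      if (c + 1 < board[r].length) swaps.push([r, c + 1]); // swap right
      if (r + 1 < board.length) swaps.push([r + 1, c]);    // swap down
      for (var i = 0; i < swaps.length; i++) {
        var next = applyMove(board, r, c, swaps[i][0], swaps[i][1]);
        var result = solve(next, maxMoves - 1, applyMove, isSolved,
                           moves.concat([[r, c, swaps[i][0], swaps[i][1]]]));
        if (result) return result;
      }
    }
  }
  return null;
}
```

On a 6 by 7 board there are only 71 adjacent swaps per move, so even 3 moves deep is at most 71³ ≈ 358,000 branches - nothing for a computer.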

So, I got to work, and pretty easily got it solving the puzzles.
It works well on the puzzles I tested it on, which include puzzles with 1, 2 and 3 moves to the solution. The annoying part is that you have to enter the whole map of the level into a two dimensional array, which can take a long time. Then, all you have to do is tell it how many moves it gets, and hit F5!

Here's a screenshot of what I see when the code solves my puzzle -
Spoiler - This is the solution to the level "Osaka 22"

I put the code up on google code's project hosting if you wanna take a look, but this should come with a disclaimer -
The code is really ugly and hacky! This is only because it is a 'coding for fun' project at home, for something that will never reach any kind of production. The only thing that was important to me at the time was that it worked. I don't see myself using this ever again.
With that said, the logic of the board was still coded using classic TDD. It just seemed logical doing it that way, so it even has some cool tests.

http://code.google.com/p/move-the-box-solution/

...and if you're wondering, the answer is "yes" - This was a stupid idea, since I basically took all the fun out of playing this game for me! :)


Enjoy :)

Tuesday, April 17, 2012

Am I a good enough programmer?

Ever since I started programming, this question has been bugging me. Now I am trying to reflect on it from a different angle, trying to understand what it even means to me, and whether I am in fact a "good enough programmer".

One of the most important things I learned a couple of years ago (from my team leader at the time), was that you can't define a goal without having a well-defined system to measure the success of reaching it. So, I must first define what being 'good enough' means to me.
In order to do that, I need to go back to the beginning...

The beginning
I started programming when I was about 12 years old (around the year 2000). It was building small applications for my first PC in Visual Basic for a short period, and about a year after that I started building websites using PHP and MySQL. Back in the day, I remember myself learning a lot from some books I ordered from amazon and reading easy-to-understand tutorials on the internet. I think at the time I was pretty much a script kiddie. A lot of the sites I built were from tutorials on the internet, and when I ran into problems I would run straight to the internet forums and wait for someone to help me out. I looked up to the ones in the forums answering my questions and publishing all the open source javascript/php components I was using freely, when sometimes I didn't even understand how they worked.
What I remember clearly from that period was wanting to be as good as those 'guys' on the internet - the ones answering my questions and publishing open source. I wanted to be another success story like the guys from Mirabilis or other companies at the time I was reading about during the dot-com bubble period.

Getting better
After a while, I felt like I had a lot of knowledge about PHP and MySQL (boy, was I wrong at the time), and I started building ecommerce websites as a freelancer. I built some nice websites (some of which are still running to this day without any maintenance) and I felt like I was on top of the world. I was just waiting to finish high school and make a career out of my favorite hobby!

Working on a team
After high school I served in the Air Force (mandatory in Israel) for 3 and a half years, 2 of those years as a computer programmer. This was my first time working on a team with other developers. Some of these developers had fancy college degrees and some had a couple more years of experience than I did. This blew my mind! All of a sudden I felt like I had met a new species of programmers I didn't know before. It seemed like these guys didn't just know how to program; they had a deep understanding of what their code means and how it affects the CPU. They all had their own blogs, and every week one of them would present a lecture, talking about something new and teaching the team about new technologies.
I was totally blown away by this, and this convinced me that I wasn't even close to where I wanna be as a programmer...

Getting even better
I started reading a shit load of programming books, subscribing to blogs on the internet, and diving into some leading open source projects that were popular at the time. I even started a blog of my own as a way of learning more myself. During this period I learned so much, and looking back at it, I would never have reached the level of knowledge I have today without working with other people better than me (just a side note to others). By the end of those two years, I even managed to give some lectures myself, and I think in some of them I got to teach others something new.

The 'real' world
My first 'real' job was working for an Israeli credit card company. At first I felt intimidated by this, since I didn't know if I had gained enough experience to be working in a big company, so I continued doing what I had learned to do. Learn, learn and learn! I continued reading a lot of blogs and learning as much as I could, so I could be as good as the best at my workplace. That didn't take long, and I became one of the best there. I soon became bored and started looking for a new job...
I interviewed at a couple of places but only clicked with one of them. (I wrote about this in a previous post : A new direction.) This is Sears Israel (SHC), which has been my workplace for the past 6-7 months. Not surprisingly, here I met programmers even better than any I had met before, and I immediately wanted to follow in their footsteps.

I could go on about how I continued learning as much as I can, but I think you get the point (and looking back at how much I wrote, I doubt anyone would read this whole post!)...
So now I look back at that feeling I had when I first started out programming, not knowing how good I wanted to be, just knowing I wanted to be better, and I think today I can say I feel almost the same way. A couple of times already I have tried to define goals as to 'how good' I want to be, but these definitions never hold, since there are always new technologies out there to conquer and more programming languages to learn. The second I feel like I've reached that point, I just find something new I want to learn and think I'm not good enough since I don't know it yet...
This is probably one of the reasons I am still in this field of work. The challenges are never over, and your work is never complete.

Today, I can appreciate the fact that I am working amongst some of the most talented programmers in Israel.
I know I'm not the best, but I am still striving to get there! I still don't know what being the best means, and if I'll ever get there, but I do know that I'm enjoying the whole process. I know as a fact I am getting better all the time and I know that just the fact of wanting to be better and doing something about it makes me better than most.

I know I am a whole lot better than I used to be just a couple of years ago, and I'm pretty sure that in a couple of years from today I will be able to say the same again. This thought gives me some peace :) and is VERY satisfying by itself!

Part of the reason I am writing this post is to be a message to myself and to others - You are not the best (this goes for 99.99% of potential readers), but (and this is a big 'but') you should always strive to be the best. Keep reading, learning and programming as much as you can and maybe one day you will be. Doing this will keep you in a good position and on the way to getting there.

Tuesday, March 6, 2012

Cleaning up that ugly client side aspx code


When managing a big web project, you might find yourself ending up with a big master page that has a bunch of centralized client side code, like includes. What I mean by 'includes' is the part of the code where you spit out all the declarations for js and css files. I've gotten to this stage many times myself, and I've seen it happen at my various workplaces as well.

This usually ends up as some big nasty chunk of code that looks like this (and this is a relatively small example of what I mean...) :
<link rel="stylesheet" href="../../Content/bootstrap.css" />
<link rel="stylesheet" href="../../Content/common.css" />
<link rel="stylesheet" href="../../Content/widgets.css" />

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js" type="text/javascript"></script>
<script src="@Url.Content("~/Scripts/bootstrap/bootstrap-dropdown.js")" type="text/javascript"></script>
<script src="@Url.Content("~/Scripts/common.js")" type="text/javascript"></script>
<script src="@Url.Content("~/Scripts/homepage.js")" type="text/javascript"></script>
<script src="@Url.Content("~/Scripts/add-form.js")" type="text/javascript"></script>
<script src="@Url.Content("~/Scripts/search-form.js")" type="text/javascript"></script>
<script src="@Url.Content("~/Scripts/bookmark-list.js")" type="text/javascript"></script>
<script src="@Url.Content("~/Scripts/register.js")" type="text/javascript"></script>
<script src="@Url.Content("~/Scripts/openid.js")" type="text/javascript"></script>


Why is this bad?
For starters, there's so much similar text here that typing it all up should've rung a huge bell for us, screaming "THIS IS WRONG!!".
The clutter of text keeps us from seeing what's really important here, which is simply the files being included.

Why does this happen?
The main reason is that we don't think of client side code as 'actual' code, which leads us to forget about DRYing it up. The equivalent in server side code would be to copy the whole body of a certain method a dozen times, and just change one parameter between the copies. If this were server code, we would've easily spotted the duplication and extracted it into a method that receives a parameter, saving a lot of typing and keeping the code maintainable.

How do we fix this?
All we need to do is introduce a small helper method that prints this again and again for us. We can do this on the client or on the server. In this case, I would go with the client, just because this piece of code will probably only be useful in this specific context.

Here is my clean solution to this problem :
@{ Func<string, IHtmlString> JsRequire = s => Html.Raw("<script src=\"" + Url.Content(s) + "\" type=\"text/javascript\"></script>"); }
@{ Func<string, IHtmlString> CssRequire = s => Html.Raw("<link rel=\"stylesheet\" href=\"" + Url.Content(s) + "\" />"); }

@CssRequire("../../Content/bootstrap.css")
@CssRequire("../../Content/common.css")
@CssRequire("../../Content/widgets.css")

@JsRequire("https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js")
@JsRequire("~/Scripts/bootstrap/bootstrap-dropdown.js")
@JsRequire("~/Scripts/common.js")
@JsRequire("~/Scripts/homepage.js")
@JsRequire("~/Scripts/add-form.js")
@JsRequire("~/Scripts/search-form.js")
@JsRequire("~/Scripts/bookmark-list.js")
@JsRequire("~/Scripts/register.js")
@JsRequire("~/Scripts/openid.js")


Sunday, March 4, 2012

jQuery relative position plugin - nextTo


The Problem :
I've already created quite a few jQuery plugins in the past, at work and for personal use, and in many of them certain pieces of code tend to repeat themselves.

One of these recurring pieces of code has to do with positioning an element relative to another element.
For example, when creating a plugin for a drop-down menu or a tooltip, you can't avoid having a nasty piece of code in there whose only job is to calculate the element's position relative to the element that was clicked on or hovered over.

I don't think there's any need for me to post an example of this; you probably know what I mean. This is usually the least maintainable part of the code in the plugin, and the hardest to understand.

The Solution :
I finally decided to extract this ugly piece of code into a nice jQuery plugin that holds all the dirty calculation work and leaves you with a clean, understandable piece of code inside your own plugin.

<script type="text/javascript">
    $(function() {
        $('.PutThisDiv').nextTo('.ThisOtherDiv', {position:'right', shareBorder:'top'});
    });
</script>
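To give a feel for what the plugin hides, the core of such a positioning calculation is just offset math. Here's a minimal sketch (not the actual next-to source; the function name `nextToPosition` and the exact option handling are my own illustration, mirroring the `position`/`shareBorder` options used above):

```javascript
// Sketch of the positioning math a plugin like this encapsulates.
// anchorRect: {top, left, width, height} of the element we position next to.
// elemSize:   {width, height} of the element being placed.
function nextToPosition(anchorRect, elemSize, options) {
    var left, top;

    // Horizontal placement: 'right' puts the element flush against the
    // anchor's right edge, anything else against its left edge.
    if (options.position === 'right') {
        left = anchorRect.left + anchorRect.width;
    } else {
        left = anchorRect.left - elemSize.width;
    }

    // shareBorder: which edge of the anchor the element lines up with.
    if (options.shareBorder === 'top') {
        top = anchorRect.top;
    } else {
        top = anchorRect.top + anchorRect.height - elemSize.height;
    }

    return { left: left, top: top };
}
```

The real plugin would feed these coordinates into jQuery's `.css()` or `.offset()`; the value of extracting this is that the arithmetic lives in one tested place instead of being re-derived in every tooltip or menu plugin.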


This plugin is hosted on Google Code: https://code.google.com/p/next-to/ (project name: 'next-to').
On the project page you will find more sample usages, usage explanations, the source code and a minified version.

Saturday, March 3, 2012

FireSheep version 2.0


FireSheep version 1.0
I think about two years ago I read about the FireSheep Firefox plugin, which lets you hijack the account of any user on many different sites (Facebook, Flickr, Twitter, etc.) who is surfing on the same wifi network you are using. This can be extremely brutal in any coffee shop, hotel, or airport, or while just sitting outside someone's house stalking them, whatever...
The point is, the person who created this, Eric Butler, didn't build it as a hacking tool, but as a wake-up call to all the sites that weren't encrypting their connections via SSL, and a lot of them still haven't changed that since...

FireSheep in action...

The potential danger
The second I read about this, I couldn't stop thinking about what a dangerous tool this could become. Imagine this: someone extends the tool to send all the currently active session cookies in the local wifi network to an online database, so that the active sessions seen by all FireSheep users are shared worldwide. Then you don't even have to be on the same wifi network as someone to hijack their account. All you need is for someone else to be there while you're in the comfort of your own home... Isn't the internet a beautiful thing ? :)

The future...
Two years (maybe more) later, and I'm happy to see that no one has done this yet, but I am still very afraid of the day someone will!
I looked at the FireSheep code a little out of pure curiosity, but never even downloaded it or tried it myself. I'm not a hacker and not interested in becoming one. The one thing I am concerned about here is my own personal security, so I am still hoping these sites will improve their security for the sake of their users. Unfortunately, sometimes the only thing that speeds up the process is a lunatic taking advantage of the current situation.

Till then, beware...

Thursday, February 23, 2012

Unit Test looking for missing files in your project


When you're working with a big team of developers and a nice build server, you'd often rather work on a separate feature branch so you don't break the build with every commit. This is very comfortable since you have your own "work zone" and don't bother anybody else, nor have to worry about breaking the build, until you're done with your feature and ready to reintegrate into the trunk (or your main working branch).

But this comes with a price...
It's not always easy dealing with all the conflicts when reintegrating your branch, and sometimes this leads to problems. One of the main problems we ran into a couple of times recently at my workplace was that, while dealing with the csproj file conflicts during reintegration, we missed some new files that had been added to the project.

This means we had new files that were added to the project and existed in the project directory, but were not listed in the '.csproj' file. Such a file might be a class that is referenced throughout the project, in which case you're lucky: the project won't compile, which immediately lets you know there was a problem. But if you're not lucky, it will be a controller class file or an '.aspx' file, and leaving it out will never fail the build server (assuming your project isn't 100% covered with tests - a totally different subject for a whole new post!).

So... what's the easiest way to quickly solve this problem ? Write a test that checks it! :)
That's exactly what I recently did, and now every time this happens the build server fails, with a nice message showing you all the files you left out!
(Of course, you need to make sure you didn't accidentally leave this test out of the csproj!! :)

For your unit testing pleasures -

using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Xml.Linq;
using NUnit.Framework;

[Test]
public void CheckingThatAllFilesAreInCsProj()
{
    const string projectName = "TestWebsite";
    var projectDirectory = GetProjectDirectory(projectName);
    var projMissingFiles = RetrieveMissingFilesList(projectDirectory.FullName, projectName);

    Assert.IsTrue(projMissingFiles.Count() == 0,
        "There are files that are missing in the .csproj file : \n" + string.Join("\n ", projMissingFiles));
}

private DirectoryInfo GetProjectDirectory(string projectName)
{
    var currentDirectory = new DirectoryInfo(Directory.GetCurrentDirectory());
    while (!currentDirectory.GetDirectories("*", SearchOption.TopDirectoryOnly).Any(x => x.Name == projectName))
    {
        currentDirectory = currentDirectory.Parent;
    }
    return new DirectoryInfo(currentDirectory.FullName + "\\" + projectName);
}

private string[] RetrieveMissingFilesList(string projectDirectory, string projectName)
{
    var csFiles = Directory.GetFiles(projectDirectory, "*.cs", SearchOption.AllDirectories);
    var aspxFiles = Directory.GetFiles(projectDirectory, "*.aspx", SearchOption.AllDirectories);
    var projFiles = csFiles.Union(aspxFiles);

    var csprojDocument = XDocument.Load(projectDirectory + "\\" + projectName + ".csproj");
    var csFileElements = GetCsFileElements(csprojDocument);
    var contentFileElements = GetContentFileElements(csprojDocument);
    var allFileElements = csFileElements.Union(contentFileElements);

    var projMissingFiles = new List<string>();
    foreach (var file in projFiles)
    {
        // Compare the path relative to the project directory against the csproj's Include attributes
        var relativePath = file.Replace(projectDirectory + "\\", "");

        if (!allFileElements.Any(x => x.Attribute("Include").Value.ToLower() == relativePath.ToLower()))
            projMissingFiles.Add(relativePath);
    }

    return projMissingFiles.ToArray();
}

private static IEnumerable<XElement> GetContentFileElements(XDocument csprojDocument)
{
    return GetFileElements(csprojDocument, "Content");
}

private static IEnumerable<XElement> GetCsFileElements(XDocument csprojDocument)
{
    return GetFileElements(csprojDocument, "Compile");
}

private static IEnumerable<XElement> GetFileElements(XDocument csprojDocument, string elementType)
{
    return csprojDocument.Elements()
        .Where(x => x.Name.LocalName == "Project").Elements()
        .Where(x => x.Name.LocalName == "ItemGroup" &&
            x.HasElements &&
            x.Elements().Where(t => t.Name.LocalName == elementType).Count() > 0).Elements();
}


The code in the test isn't the most beautiful thing I've written, but it serves a great purpose.
All it does is search the project folder for all .cs and .aspx files, and then dissect the csproj file (which is just a big xml file) to find all the file references.
The code I posted only checks for cs/aspx files, but you can easily add support for js, css, html and any other file type you like...

Enjoy :)

Thursday, February 16, 2012

.net to Java transition gets a little easier for me...


I'm taking a university course in Java this semester, so recently I've been writing a lot of Java, using IntelliJ (Community Edition) by JetBrains as my IDE. As mostly a .net guy, I'm used to Visual Studio with ReSharper 6, and I know so many keyboard shortcuts by heart that make my coding work so productive!

So you can guess how hard it is for someone like me to transition to a different IDE and not have ANY of the shortcuts I'm used to. I tried learning some of the IntelliJ shortcuts, but it just got too annoying, so I finally decided to take the time and map ALL of the IntelliJ shortcuts to the familiar VS ones once and for all. While doing that, I ran into a nice surprise: the brilliant guys at JetBrains were one step ahead of me the whole time, and have already added a VS keymap option in their settings panel!!!




Thank you JetBrains for making my Java experience just a little bit more comforting! :)

Saturday, February 11, 2012

VS Spell Check Extension - using Roslyn


A couple of months ago Microsoft published the Roslyn CTP, which gives us an inside view of the compiler's view of our code. It also comes with project templates for creating CodeIssues and CodeActions (code suggestions/actions made available to the coder in the IDE).

Ever since I read about this it has interested me, and I wanted to play around with it. I've been programming in Java at home lately due to a course I'm taking at the university, and I realized that one of the nice features IntelliJ has is suggestions for spelling mistakes in variable names (misspelled names are one of the most annoying things I see in code, and I truly believe they make it harder to maintain), so I thought I'd try to implement this in VS as an extension.

It was really simple to do so -

I just created a CodeIssue project :


With the ExportSyntaxNodeCodeIssueProvider attribute, you can tell it to execute the code only on variable declarations :


Split each variable name into separate words (most variables, at least when I write code, look like "var myReceivedFilesList", so I split them by case) :
public IEnumerable<CodeIssue> GetIssues(IDocument document, CommonSyntaxNode node, CancellationToken cancellationToken = new CancellationToken())
{
    var variable = node as VariableDeclarationSyntax;
    if (variable != null)
    {
        var nodes = variable.ChildNodes().Where(x => x.Kind == SyntaxKind.VariableDeclarator);

        foreach (var syntaxNode in nodes)
        {
            var fullVariableName = syntaxNode.GetFirstToken().ValueText;

            var wordsInVariableName = fullVariableName.SplitToWords();
  
            if (wordsInVariableName != null)
            {
                var correctSpelling = CheckAndCorrectSpelling(wordsInVariableName);

                if (correctSpelling != null)
                    yield return new CodeIssue(CodeIssue.Severity.Info, syntaxNode.GetFirstToken().Span, "Possible Typo ?\nMaybe you meant : " + correctSpelling);
            }
        }
    }
}
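The `SplitToWords` helper used above isn't shown in the post, but the case-splitting logic it needs is simple. Here's a sketch of the equivalent logic (in JavaScript for brevity; the name `splitToWords` and the simplification of only handling plain camelCase, with no null case, are my own assumptions, not the actual C# helper):

```javascript
// Split an identifier like "myReceivedFilesList" into lowercase words
// by uppercase boundaries - a sketch of what a SplitToWords helper does.
function splitToWords(identifier) {
    // Insert a space before each uppercase letter, then split on whitespace.
    return identifier
        .replace(/([A-Z])/g, ' $1')
        .trim()
        .split(/\s+/)
        .map(function (w) { return w.toLowerCase(); });
}
```

Each resulting word can then be checked individually against the dictionary.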


And voila!
You have the spell check feature on variable names in your VS IDE :



For the spell check I used the NHunspell project (This is the .net wrapper to the OpenOffice dictionary).

I put the code up on google code so you can take a look at it, and who knows, maybe even contribute to it... :)
https://code.google.com/p/gb-coding-extension/

I don't know if this is production-worthy code; I didn't even check its performance or whether it slows down the IDE. This was just a small learning experience for me.
Maybe if I have some more free time, and good ideas, I'll add more features to it in the future...