Recently Ilya Ryzhenkov published a post on the ReSharper blog about how R# 4.0 suggests that you use implicitly typed locals.
I've been running R# 4.0 nightly builds for a few weeks now and I've noticed these little buggers popping up all over the place.
Here's what it looks like:
Usually I agree with most of what the R# team posts, and even when I disagree with something, I'll find the edge case where it does apply, so we're at a 99% agreement rate with them! Hell, you can't deny it: R# seriously improves your productivity and it only keeps getting better.
Unfortunately, this is the one time I'm going to fully disagree with a post, and with the suggestion in R# itself, which is why I have disabled it. You can disable it by going to the R# options:
Why I Disagree
The post stated that it "induces better naming for local variables". I don't know about that. I can still call an apple an orange and call an orange a banana if I want. Nothing forces me to do anything. The only thing I know at that point is that it's still an anonymous type. A name is just that, a name. Nothing forces developer X to write a good variable name. I still see junior developers using wrong variable names. The only thing that's going to help here is a good code review process.
The post then goes on to say that it "induces a better API". Again, I don't agree. How can this induce a better API? I feel that letting the compiler infer the types has its validity in certain cases, but not throughout your entire system. It brings back the horror days of VB's "Variant" (and yes, I'm aware that var is not Variant and I know the differences, but have you seen an old system where EVERYTHING was a Variant? Oh my jeebus, save me now). Just because I'm letting the compiler do the work doesn't mean I'll suddenly have "good variable names" to help me distinguish what I'm working with.
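To make that concern concrete, here's a small sketch of what I mean (the `LoadScores` method and its types are made up purely for illustration):

```csharp
using System;
using System.Collections.Generic;

class Example
{
    // Hypothetical API, just to make the point about readability.
    static Dictionary<string, List<int>> LoadScores()
    {
        return new Dictionary<string, List<int>>
        {
            { "alice", new List<int> { 90, 85 } }
        };
    }

    static void Main()
    {
        // Explicit: the reader sees the shape of the data immediately.
        Dictionary<string, List<int>> scores = LoadScores();

        // var: the reader has to go find LoadScores (or hover in the
        // IDE) just to learn what 'data' actually is.
        var data = LoadScores();

        // Both compile to exactly the same thing at runtime; the only
        // difference is what the person reading the code can see.
        Console.WriteLine(scores.Count == data.Count);
    }
}
```

The compiler knows the type either way; the question is whether the next developer reading the file does.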
This is the one that sent me over the top... "It removes code noise". *insert sound of game show buzzer* Yeah... um... I'd have to say that's complete nonsense (in my opinion). Overuse of the var keyword is going to add code noise and is definitely a smell to me. If I open up a class and see everything typed as "var", I'm going to cringe. The readability of that code has diminished to the point where it now costs me more to maintain and read than it would if I used explicitly typed variables where I could. Even the MSDN states:
Overuse of var can make source code less readable for others. It is recommended to use var only when it is necessary, that is, when the variable will be used to store an anonymous type or a collection of anonymous types. [Source]
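And that's the case where var genuinely earns its keep: an anonymous type has no name you can write down, so var is the only way to bind one to a local. A quick sketch (the values are made up for illustration):

```csharp
using System;
using System.Linq;

class AnonymousTypeDemo
{
    static void Main()
    {
        // An anonymous type has no utterable name, so var is required here.
        var item = new { Name = "Widget", Price = 9.99m };
        Console.WriteLine(item.Name);

        // LINQ projections into anonymous types are the classic case:
        // there is simply no explicit type you could write instead of var.
        var prices = new[] { 3.50m, 12.00m, 7.25m };
        var cheap = prices
            .Where(p => p < 10m)
            .Select(p => new { Price = p, Discounted = p * 0.9m });

        foreach (var line in cheap)
            Console.WriteLine(line.Price + " -> " + line.Discounted);
    }
}
```

That's the "special case" var was added for; everywhere else, the explicit type is free documentation.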
The last one is almost not worth putting into the post... "it doesn't require a using directive". Wait a minute... I bought ReSharper so I could be more productive... hitting ALT + ENTER to add the directive isn't a bad thing. Who cares if another directive is up there? That's what it's for: telling the compiler which namespaces to look in during compilation.
This is not a bash on the ReSharper guys, not at all, but an explanation of why I completely disagree with this post. Heck, I agree with almost everything on that blog (most of the time), so I never need to speak up about it. I'm a huge ReSharper fan and I will continue to use it and proclaim its greatness, but while reading this post I noticed a smell immediately. While var has its uses, I think it can be abused.
The saying goes... "When all you have is a hammer, everything looks like a nail." Let's not use var as our hammer. It's a special tool for special cases. We don't use ice axes as steak knives, do we (even though that would be very manly and barbaric, and hell, kind of fun)? So we shouldn't use var for unintended purposes.