Tuesday, 13 October 2015

Why did you put this in your job ad?

You might have noticed that I'm currently semi-looking through vacancies. The current project ends at some point, and it will be time to change jobs sooner rather than later.

This is the sixth time in my life that I am doing this and I am definitely becoming more and more selective. I also tend to spot "dangerous" phrases in job ads. "Dangerous" as in 
  • makes me doubt the quality of the work environment
  • contains a technically invalid or doubtful statement
And I do not mean something like "You consider Lennart Poettering your personal hero", which is probably unique. I might strongly disagree with the requested sentiment, but I do give them credit for drawing quite a good demarcation line with a little humor mixed in. If you don't like Poettering's work (and you really shouldn't), you won't even consider applying.

What I do mean are statements that
  • are empty standard phrases copied and pasted from the last job ad,
  • make you wonder whether what is stated is really what was meant, or whether the statement was mangled on the way from tech to HR,
  • should make you think about what they mean and why you should reconsider applying.
First of all, many, many job ads suffer from a total one-sidedness of information. There is usually a long list of requirements that a potential candidate is expected to fulfill. And then --at the end-- there is either nothing or something like "look forward to a responsible and challenging position in a nice team".

Which is just another nail in the coffin with regard to the skilled worker shortage. You know, in a market, when there is a shortage, prices go up and there is marketing. "A nice team" is the absolute minimum I would expect from any employer. You know, I consider working with jerks a criminal offense against my personal well-being. Can you please tell me a little bit about your company and what it is like to work at your place? Or at least provide a link to exactly that information on your website? Because ... well ... I will spend at least 40 hours a week with you, your office and the other girls and guys you hired, so I might want to know beforehand whether things will work out ...

That's about it for the general problem of "informing the potential candidate". On to more specific stuff, shall we?

My favorite among the concrete statements is "able to work under pressure". That one is even better in German, where it's "belastbar" (resilient), i.e. able to put up with a lot of psychological (and maybe physical) stress.

It's a standard phrase that a smart HR rep should probably remove from his/her vocabulary, especially when looking at the current work environment problems in IT. I have also found it to be quite rare nowadays, which probably means job ad writers are not as stupid as one might think.

The problem with the phrase is that it can mean that stress management is completely your own personal problem. Work gets dumped onto you and it is solely your job to organize it. Deadlines and requirements change and it is still your job to get everything done on time. Expect no sympathy for your personal well-being from your employer.

If you get the impression that this harsher interpretation is meant in the job ad: stay away. Those companies burn you out, and they probably do not even understand that they have a problem.

In my list of infamous phrases, this is immediately followed by "highly motivated individual" or --much worse-- "can-do attitude". The former almost always means that motivating yourself for a task is exclusively your own problem. To tell the truth, I cannot imagine an environment where seeing the need to state that they need an especially motivated individual is not a bad sign. Can-do attitude, on the other hand, is also a dangerous game because, realistically, some things just cannot be done within the allocated time and budget, and can-doers often enough simply lack the experience or confidence to hold their ground when assigned impossible tasks. If you think the company wants a yes-man can-doer, stay away.

"ability to work independently" is more ambiguous. It can be interpreted as a euphemism for "we don't have proper organization, so you are on your own, but we still hold you responsible for results". On the other hand, it might also mean that you are given plenty of freedom to decide on your own how to approach and accomplish your tasks. Remember, however, that IT work environments are not what they used to be, so be wary that more independence might actually mean more responsibilities without sufficient resources. Remember, "you want logistics, join the army, Marines make do".

I also like "fast-paced work environment", "be on the pulse of the newest technologies", "things might be moving faster than you expected". These often all mean the same thing: the office is in chaos, everything is constantly changing, there is a lack of direction. It might also be that work is heavily "startup-like", i.e. constantly out on a limb without real financial security and planning. The latter is often indicated by using the word "dynamic", both as in "dynamic individual" and "dynamic workplace".

This might mean a lot of churn, but also often little progress, which can be and usually is exhausting and annoying. And churn but little progress is typical hype-stuff. Currently the cloud is still in the churn phase, producing a lot of different, incompatible technologies but providing little long-term progress. If you are still young and want the heat, go, take it. But if you like a little more thinking and engineering and a little less idiotic hacking: think again before applying.

"Team player" is often seen as a code phrase. Explicitly emphasizing that the team is important might mean you need to constantly put your own needs behind those of the project. In this sense "team player" might actually mean that you receive orders without complaint. As usual: it might also be a copy-and-paste standard phrase, but if you are under the impression that its use is deliberate, think again.

And then, of course, there's "target driven" and "economic thinker". Your job is to get stuff done with the minimal resources possible. You will have to file a three-page document to get a new pencil because those are expensive.

Monday, 28 September 2015

Sustainable Work Environment

Hopefully, the German myth of the "Fachkräftemangel", the fairy tale of the shortage of skilled workers, engineers, and IT specialists, has now been buried for good. And I really hope so, since I probably cannot bear another study with random factors (7.14) multiplied onto the real numbers because of gaps in the data.

Note that a very similar thing is happening in the US, with exactly the same tricks, the same untrue statements, and the same faking of statistics and studies. Actually, no: it's supposed to be even worse in the US, and some experts actually claimed it could not get as bad over here. Well, they seem to have been quite wrong. It is happening. Maybe the situation is not as bad as overseas, but it does happen: corporations lobbying for cheap skilled work and having their worker training paid for by the government or shifted as a burden onto the workers to do in their free time.

Business as usual: Capitalize results, socialize costs. Liar, liar, pants on fire. So, I claim, and I am not the first one: At least in IT, we do not have a skilled worker shortage.

We have a shortage on sustainable work environments.

There are problems in any work environment, today.

A 2008 study said
  • only 15% of workers are motivated and feel committed to their company.
  • 64% only work to rule, which probably hurts them more than their company.
  • 24% have quit internally and do only the absolute minimum required to keep their jobs.
Jeff Atwood wrote an article about programmer types in 2007, stating that 80% of programmers are the 9-to-5, leave-me-alone-with-work-after-work type that do not really care about their craft. He is quite arrogant about it, but in my opinion he is missing a very important point: if you start out as a 20%er (motivated, truly caring about your job), most development jobs nowadays seem to be such that they make you into an 80%er. It might also simply be a defense strategy to stay sane. When you realize nobody seems to care but you, you simply stop trying to push the boulder up the mountain.

And if I am to judge, work environments in many (read the numbers!) companies are essentially set up to make you into an 80%er (before eventually dropping you): whenever I open my ears, I hear the very same complaint from almost every acquaintance who also works in IT:

Experienced people, familiar with the technologies that their company is using and/or offering are heavily overtasked. It's usually in the range from 10% (for the lucky ones) up to 40% overload and this overload goes on for weeks and months.

If you are fluent in German, read this. If not, I'll summarize:
  • Almost half of all IT workers claim they have too little time off. More than 50% claim a lack of work-life balance.
  • Almost the same percentage claim that their work is not organized in such a way as to avoid and prevent stress.
  • Almost 30% state that if work requirements stay as they are, they clearly won't make it till retirement. An additional 40% think it more likely they cannot sustain their pace until retirement.
Read the last point one more time. Carefully. And remember: the results sum up to 100%, there's no overlap:
70% of all IT workers consider it more likely than not that they will not be able to do their current jobs until retirement age.
Well, I consider that quite a good definition of unsustainable.

Want more numbers?
If you look at German law, these might be considered illegal work environments.

The Rhein-Ruhr-Institute did a working paper in 2009, listing some of the major problems:
  • constant overtasking. To achieve set goals, workers need to constantly "voluntarily" work long hours.
  • new requirements and change requests are added without adjustment of deadlines and budget restrictions.
  • unplanned tasks are piled on top of planned work without plan adjustment (documentation, status reports).
  • lack of control: the possibility for creativity is limited by change requests, strict financial restrictions, and tight deadlines.
  • project workers are held accountable for results, but have insufficient resources to achieve their goals. There is a conflict between assigned tasks and available resources.
  • limited control over the work environment. Selection of suitable tools and methods is not under the control of the specialized worker.
  • project workers are held accountable for results, but cannot influence prioritization, schedule, or work organization.
  • lack of gratification. Performed work is not appreciated and seems meaningless despite the high workload.
  • team support is lacking. Teams change constantly, workers need to compete with each other, team play is implicitly discouraged.
  • lack of fairness: bad communication, suggestions for improvement are ignored, and workers are given the impression that they are foremost an expense factor. Combine this with the lack of control over the work environment.
  • loss of identification with work and company 
  • conflict between personal work ethics (do quality work) and job reality (meet unrealistic deadlines).
  • permanent pressure to keep up to date without time allocation or support (resulting in trying to keep up to date in free time)
Most strongly affected are project managers and team leaders, because they experience pressure from both sides. Support from the top seems curiously absent, something I would consider a total management failure.

Oh, and wait: did I read that correctly? Outsourcing is perceived as a threat and creates a highly stressful environment on both sides ...

The studies also strongly qualify earlier findings that work in IT is usually creative, self-determined, and autonomous. Either the work environment has changed significantly since these initial observations, or the situation was bad in the first place.

A DIWA-IT study claims that the work environment has changed significantly for the worse and blames
  • tighter financial investments in IT projects
  • high "replaceability" of workers because of offshoring, economization, and internationalization.
  • loss of influence from workers, "industrialization", market orientation. Workers feel both overtasked and helpless to achieve their goals.
  • social cohesion in teams is undermined by management methods and constant pressure.
  • a "system of constant probation": result-oriented work where workers try to achieve goals they cannot reach with their resources. Workers who do not achieve results are threatened with the loss of their job ("be more productive or be fired").
Especially the latter is a marvelous strategy: Everything seems to be managed with threats. Perceived economic pressure is pushed through directly to workers: Adapt or perish.

I only recently read an article by some cloud computing advocate. He/she basically said that corporate IT needs to offer services that are on par with off-the-shelf cloud offerings, because those are so much better and easier to use than what corporate IT offers. The threat then is: Give me this or corporate IT will die. Seriously, who are you? If such people are responsible for anything, they should be made personally responsible for any work-related illnesses of their subordinates and customers.
You want to know why your employees are no longer motivated? Because you are brilliantly killing their motivation and harvesting their energy, leaving them drained. You offer them no support and try to manage them with threats.

So what do you offer IT workers? Small-scale work with limited scope but big responsibilities. A constant need to work long hours without even seeing the results, and little or no appreciation for the work.

And when nobody cares about my work, I would try to stop caring about my work, too. Because when there's no appreciation, but only running against a wall, continuing to run against the wall is a recipe for disaster.

My theory is: all of this was working back in the day when the workload was not too high and people were motivated by their tasks, because they had both responsibility for larger systems and the possibility to influence decisions. The churning of "new" technologies was slower (I repeat: new technology does not necessarily mean innovation, and new is not necessarily better in IT) and knowledge was significantly more stable. People kept up with changes mostly in their free time or during lulls at work. And now? Lull times are being or have been eliminated. Experts talk about increased work density. This requires more focus on day-to-day tasks during work time. The result is that workers
  • either stop honing their skills and only do their jobs. With the constant churn of incompatible no-progress, their net worth is reduced simply by the adoption of "new" technologies.
  • or shift their knowledge acquisition into their free time. This makes them more likely to burn out, or at least to fatigue. Combine this with overload and lack of appreciation in their day jobs, plus the constant threat of outsourcing and the like for added benefit.
Also, the complexity of some modern systems makes it next to impossible to test them out with the resources available to a single person. You cannot learn to administer a network cluster with a few machines alone, even if you use virtual machines. I do not have access to enterprise-grade network equipment to fiddle with. I lack a few cloud servers to play with puppet, vagrant, and all the other nice tools. However, people still try and try, and they feel helpless while doing so.

This theory is backed by the DIWA-IT working paper: IT workers shut off their own health-related early-warning systems because of a loss of inner (identity) coherence. Most of them probably really liked their profession in the first place and wanted to get something done. But getting things done has become harder and harder with the (often needlessly) increased complexity of the IT landscape. Add constant churn (useless change), global competition, and greatly reduced rewards and gratifications, and you have a recipe for disaster.

Because of the constant threats, workers compromise, and these compromises more often than not are in conflict with their personal vision and work ethics. So they not only compromise their personal values, but are also forced to spend more and more time doing work they feel is insufficient, simply to retain their livelihood, because they experience a very real existential threat (loss of job, unemployability).

It is, in essence, a burnout mill, where you either go numb or fall off the cliff. And it does not get any better: the cited studies and working papers are from 2005, 2008, and 2011.


Sunday, 6 September 2015

You broke my regular expressions

A quick one to get you to re-think some of the regular expressions you wrote:

What part of

    AAAA

does

    (AA|AAAA)

match?

If you thought AAAA, welcome to POSIX (and C, C++, boost) country, where regular expressions return the longest leftmost match over the whole expression.

If you thought AA, welcome to Perl, Python, Ruby, PCRE, Java, and the like, where alternates are matched left-to-right and only the first one matches.

The latter is called eager alternation matching, but I would rather call it short-circuiting. It seems to simplify matching of alternatives and reduce backtracking (and thus memory requirements); in short: it's faster. It can also be annoying if the engine does not allow you to switch between the two interpretations, because sometimes you really want the true leftmost longest match. And almost none of the current engines allow that switch.
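A minimal sketch with Python's re module (firmly in the Perl camp) makes the short-circuit visible:

```python
import re

# Eager alternation: the first alternate that matches wins, even though
# the second alternate could have matched more of the input.
m = re.match(r"(AA|AAAA)", "AAAA")
print(m.group(0))  # AA

# A POSIX leftmost-longest engine (grep, awk, ...) would report AAAA here.
```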

It's not only alternates, however: matching A(A)?(AB)? against AAB matches AA instead of AAB with modern engines. Leftmost longest match? Nope. Instead, the quantifiers stay greedy even if the longest-match rule is violated. You need to rewrite the above as A(A)??(AB)? (note the double question marks) to get POSIX-like behavior.
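The same quantifier effect, again sketched with Python's re module:

```python
import re

# Greedy (A)? grabs the second A, leaving nothing for (AB)? to match:
print(re.match(r"A(A)?(AB)?", "AAB").group(0))   # AA

# The lazy (A)?? steps aside, so (AB)? can match and the overall match
# becomes longer -- POSIX-like behavior, reconstructed by hand:
print(re.match(r"A(A)??(AB)?", "AAB").group(0))  # AAB
```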

That greediness modifiers even exist is an artifact of dropping leftmost-longest semantics. Modern regular expression engines do not perform overall leftmost longest matching. If the documentation claims otherwise, the documentation author is clueless.

So, some time in the last 20 years, your regular expressions slowly but surely had their semantics changed for the sake of more features and more speed. And it still leads to confusion.

What do we learn from this? Well ... this is what happens when you --more or less-- silently change something that was accepted as the norm for quite a long time. It is really quite important to document such subtle but fundamental changes loudly, prominently, and clearly.

And yes, I want my leftmost longest match back, at least as an option. Python's regex module has had that option since version 2015.09.15. Nice to have, and thank you, Matthew.

Saturday, 5 September 2015

Rules for Explainers

I've been dawdling in programming forums for quite a while now and something really strikes me again and again: Some of the smartest girls and guys out there are really bad at explaining and teaching.

  1. Solving a problem is different from helping someone solve a problem.

    You will find quite a few answers to technical questions, where someone simply drops a correct solution that has nothing to do with the original post.

    Often enough they do not even bother to explain what the original poster did wrong. They just solve the problem and leave. There, I am much smarter than you, I solved your problem. Now go away.

    Don't do this. When you answer someone's question, you are not only a problem solver, you are a teacher. In that role, you are supposed to help the other learn something. And reverse engineering a working solution is not a very effective learning method, because it quite often does not  yield any insight on the underlying mechanics.

It is the old give-a-man-a-fish problem: the goal is to give the "student" enough information to help himself/herself the next time round. The oh-so-common howto style of documentation is exactly the opposite: stray from the path of the howto and you get lost.

Somebody without a deep understanding of a problem is very likely to incorrectly reason about why your solution works and his/hers does not.
    So, when you find out somebody has a non-solution to a problem, try to figure out, at least a little, how he/she actually arrived at the wrong solution. Empathy is not optional here.
     
  2. Don't explain with your goggles on.

    Recently, somebody told me, alternate expressions of a regex are matched left to right, matching the first, ignoring the leftmost longest rule.

That's correct. For Python. In the POSIX world (grep, sed, awk, ...), this is wrong, and so it is actually a really hard gotcha.

Regular expressions traditionally applied the leftmost longest match rule over the whole match. Python and PCRE regular expressions are different: they short-circuit to the first matching alternate; if a subsequent alternate could have matched more, it is ignored.

The problem is that the poster stated it as an overall fact, not bothering to explain that this is actually a very implementation-specific behavior. He/she was using his/her Python goggles to look at the problem, not bothering or not being able to take the broader view. This usually comes across as either arrogant or ignorant. It gets worse when you basically drop this as a brief comment on someone and then leave. Or in short: it's just rude.

    To be fair, in the described situation, the poster and I discussed this and came around to reconcile our view of the situation. This pushed up my respect for him quite a notch. Confrontation is only bad when there's no progress from it.
  3. Don't assume

One of my favorite responses to one of my postings starts with "If you believe that ...". Everything that follows is sarcastic condescension, very thinly veiled. And when called out, the responder simply re-asserted his position without bothering to recheck his facts. Order No. 227, anyone? Three mistakes in one go: assuming that I meant what he/she understood (I clearly didn't), immediately thinking that his/her opinion/intellect is superior to mine (I cannot comment on that), and not reconsidering his/her position when I stood my ground (possibly because of arrogance).

It happens a lot with nerds, especially younger ones. Hell, I think most of us are like this before age teaches us that we are not the smartest person on the planet. But for some, the humbling never happens. It's also kind of a self-reinforcing mechanism: most people just walk away right after the first time you are rude to them, and they never come back. This looks like a victory ("I was right"), but it is actually downright failure. The goal wasn't to be right, but to convince others. People walking away from you are not persuaded, they are disgusted. You see a pattern: yep, again, assumption won out and re-assessment did not happen. You only think you are always right because you never re-evaluate. Congratulations.

    Walking through life always totally assured that your interpretation of the world is correct gives you a lot of momentum and can take you far. It also means that it hurts all the more whenever you hit that waiting brick wall.

Allow yourself the doubt, it makes you a better person and a better teacher. And give others the benefit of the doubt.
To be continued...

Saturday, 8 August 2015

Let's, please, not C++

Let's face it, there are basically two "favourite" languages for doing performance-oriented development: C and C++.

I stopped having trouble with C years ago. It was my second (programming) language (after Pascal), and I still do the occasional development in it. It's dirty, but not too dirty, and --I think-- C has a very good reason to be dirty: tools that are used to do dirty work tend to get dirty themselves. Show me a masonry chisel without mortar dust and I'll show you a masonry chisel that isn't used.

C++, on the other hand, is a different monster. I once read about C++ that it includes almost every language feature known to man, but always only the most simple implementation of that feature. I do not agree with that. C++ includes many features, but what it implements might be called the internally leanest version of a feature. That is not the same as "simple" and it is certainly not the same as "easy to use".

The problem, it seems to me, is that those seemingly simple (i.e. internally lean) implementations usually leak badly, and the interaction with other language features typically adds layers of complexity that can slow you down to a crawl. I call this "pile design": "just dump everything over there". You can make C++ do almost everything, but you need a lot of lubricant (both in terms of time and code) to make C++'s cogs turn properly. And even if the machine seemingly runs alright, there is always another corner case that you did not think of.

Whenever you want to do something "right" in C++, you also seem to immediately need a serious pile of workaround engineering. By this, I mean every piece of questionable code that needs to be written to make the core code behave properly according to user expectations.

I do judge a language by the amount of time I spend making my code adhere to the concepts of the language, which includes --for example-- the behaviour patterns expected from standard library functionality. In C++, this amount of time is disproportionately large. And that usually means that either the language or the standard library (or both) is misdesigned.

Well, yes, C++ is misdesigned. Its language designers seem to constantly make one specific mistake: they mistake internal conceptual leanness for external conceptual leanness. Internal conceptual leanness is achieved when --inside your own code-- you try to use as few different concepts as possible. This --often enough-- also includes writing as little code as possible, but --most importantly-- it means that somebody reading your code does not have to read the full language specification to understand it. External conceptual leanness means that you expose as little complexity as possible.

So internal leanness is a combination of
  • fast (performance wise) algorithms
  • efficiency/simplicity for the tool/language creator
  • limited number of internal concepts
  • and so on
External leanness is a combination of
  • Easy access to algorithms
  • simple APIs without over-complex dependencies
  • limited number of user-visible concepts
The goal in every design (language or software) is to have as much external conceptual leanness as possible without sacrificing features, while keeping internal conceptual complexity low enough to be manageable. The trick (and the glory) in the design is to balance internal and external leanness, which are quite often at odds.

From my personal experience:  Designs by committee are never externally lean.

And C++ fails almost spectacularly at being externally lean. In many parts of the language itself and of the standard library, it tries very hard to be internally lean (down to really, really bad variable names, which is the dark side of internal leanness), for example by exposing internals of library functionality that would better be wrapped (to make it externally lean).

Let me give you an example: You want an indexable collection that mimics an STL container but allows listening to modifications, e.g. when you call insert(), you want notify_before_add() (or similar) to be called.

If you are a C++ programmer, you might already have guessed that the above collection is quite tricky to get right. First of all, a proper STL container needs a real load of annoying boilerplate code to work properly. Which is bad in itself, but given the functionality provided it might be acceptable. It still is a failure of external leanness (it actually works for internal leanness, because it keeps code duplication in the STL low).
However, remember that some of the STL containers (those that return at least non-const forward iterators) pass back references which you can modify in place. Nicely enough, you cannot test for forward iterators.

std::set's iterator type changes from bidirectional to constant bidirectional in C++11.  The documentation goes to great lengths telling you that you cannot modify an element inside a set, but downgrading the iterator to its non-mutable type was probably simply overlooked in earlier versions. I can imagine, tracking all of the exceptions to your own rules can get you lost.

For example,

    std::vector<int> v;
    v.push_back(1000);
    *(v.begin()) = 2000;

is a perfectly valid operation to modify the first element of an std::vector.
So when we want to be notified of that change (interpreting it either as a remove followed by an add, or as an actual replace, if you have a notification for that), we need to wrap the iterators. All four of them, because you cannot make assumptions.

This is not too bad (you need to do so in Java, too). My point about language misdesign is, however, the externalization of the lvalue reference to the internal object. Most of the sequential STL collections allow you to modify the object inside the collection in place, which creates a real load of complexity should you ever try to wrap a collection.

Side note: Fortunately, you cannot modify std::map's keys in place, that would be really awkward. So this

    std::map<int, std::string> m;
    std::get<0>(*m.begin()) = 1;

does not compile.

Let me show you the problem with operator[]. Every other language I have seen so far gets this right: when you allow overloading of the index operator, you really need to provide two operations, get and set. C# does it (this[]), Python does it (__setitem__(), __getitem__()), Ruby does it ([] and []=), and so on.
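In Python, for instance, read and write access dispatch to two distinct hooks, which is exactly what a notifying collection needs. A minimal sketch (the class and its event recording are made up for illustration):

```python
class NotifyingList:
    """Wraps a list and records write accesses (illustrative only)."""

    def __init__(self, items):
        self._items = list(items)
        self.events = []

    def __getitem__(self, i):
        # a[i] on the right-hand side: pure read access.
        return self._items[i]

    def __setitem__(self, i, value):
        # a[i] = x: the language hands us the write explicitly.
        self.events.append(("remove", self._items[i]))
        self._items[i] = value
        self.events.append(("add", value))

a = NotifyingList([1000])
a[0] = 2000
print(a.events)  # [('remove', 1000), ('add', 2000)]
```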

Guess which language decided that a[i] = b can be decomposed into two operations, namely "auto &temp = a[i]; temp = b;", and hence you only need the accessor part of [], because that is so much simpler (internally)?

Yeah, of course.
If you are a Python and/or C# developer, you might be inclined to protest that

    a[i] = x;

and


    value_type &temp = a[i];
    temp = x;

are two very different things. Yes, in your (and in my) world that makes perfect sense.
But this is C++ and C++ often seems to make a point to be different. Different as in not externally lean. In C++,

    a[i] = x;

means "copy the contents of x into the object at the ith position inside a". Since the object (i.e. memory) at a[i] does not move, from C++' perspective, the collection's contents have not been modified. So in C++, unlike everywhere else I know of, a[i] = x is not considered a modification to the container.
I think it is those little annoyances of being different for the sake of internal leanness that makes C++ suck so often.
Hence in C++, if you want the notifying collection, you need some kind of descriptor to act as the returned reference:
#include <cstddef>  // size_t

template<typename C, typename Ref = typename C::reference>
class Descriptor {
private:
    C &_collection;
    Ref _ref;

public:
    Descriptor(C &collection, Ref ref)
        : _collection(collection), _ref(ref)
    {
    }

    template<typename T>
    Descriptor &operator=(T value)
    {
        _collection.notify_before_remove(_ref);
        _ref = value;
        _collection.notify_after_added(value);
        return *this;
    }

    operator Ref()
    {
        return _ref;
    }
};

template<typename C>
class Array {
public:
    typedef typename C::value_type value_type;
    typedef Descriptor<Array<C>, typename C::reference> reference;
    // and quite a few more

    reference operator[](size_t n)
    {
        return reference(*this, _items[n]);
    }

    // the notification hooks the Descriptor calls
    void notify_before_remove(typename C::reference) { }
    void notify_after_added(const value_type &) { }

private:
    C _items;
};

Of course, we are going against C++'s implied semantics here. For the standard library containers, everything you stuff into them is just a memory location that the library manages. It does not care what you do with that memory, as long as you provide the standard library with sufficient tools (copy and move constructors, copy-assignment and move-assignment operators) to shuffle the contents of the managed memory to a new location if needed.

<functional> offers reference_wrapper<> which also covers callables. As usual: Caveat emptor for the overrider since nothing is virtual.

Wednesday, 8 July 2015

For Loop Scoping is Evil

Try the following in Java 7:

import java.util.List;
import java.util.ArrayList;

public class Test {
    private final List<Object> a = new ArrayList<>();

    public void doSomething()
    {
        for (Object a: a) {
        }
    }
}

As you probably expected, this does not compile, because you cannot iterate over an Object (and Object a should shadow the private field):

  error: for-each not applicable to expression type
        for (Object a: a) {
                       ^
  required: array or java.lang.Iterable
  found:    Object
However, if you consult the JLS, this behaviour is actually wrong:

The enhanced for statement is equivalent to a basic for statement of the form:
for (I #i = Expression.iterator(); #i.hasNext(); ) {
    VariableModifiers_opt TargetType Identifier =
        (TargetType) #i.next();
    Statement
}
That is, according to the specification, Object a is meant to be declared inside the for loop's block after the outer a (Expression) has already been evaluated.

Java 8 corrects this mismatch between specification and implementation and the code above compiles fine.

The effects, however, are probably counter-intuitive. The enhanced for loop switches around the declaration of the variable as it appears in the code (Object a) and the point in the program where the declaration takes effect.
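For comparison, Python has always used the Java 8 reading: the iterable expression is evaluated once, before the loop variable is bound, so the loop variable may even reuse the name of the iterable:

```python
a = [1, 2, 3]

# The expression `a` is evaluated first; only afterwards is the name `a`
# rebound to each element in turn.
for a in a:
    pass

print(a)  # 3
```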