Flash, specifically ActionScript, was very easy to learn, but it got more and more difficult as features were added to a syntax that didn't really resemble languages I was more familiar with. While simple statements were easy to parse, the interactions between and within the more complex constructs made following the logic harder as you moved up the scale of complexity. Some of that is to be expected, but instead of a steady climb, there were leaps.
In addition, I found the naming conventions needlessly verbose. I understand the reasons for them (predecessors, and Adobe wanting something familiar) but disagree with the implementation. The reason is simple: when you're reading code, it is less efficient to mentally hold "addEventListener" than it would be to use "trigger" or simply "event" while you're also taking in what the event you're listening for is doing. As one becomes more familiar with the syntax it blends in, but I find that process inhibited by the conventions chosen in the first place.
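To make the verbosity complaint concrete, here is a minimal sketch in JavaScript (the closest widely-runnable relative of ActionScript); the `Dispatcher` class and the shorter `on` alias are hypothetical, invented for illustration, not part of any Flash API:

```javascript
// Hypothetical event dispatcher, for illustration only.
// The point: "addEventListener" is a lot to scan while you are also
// reading what the handler itself does; a short alias like "on" is lighter.
class Dispatcher {
  constructor() {
    this.handlers = {};
  }
  // ActionScript/DOM-style verbose name
  addEventListener(type, fn) {
    (this.handlers[type] = this.handlers[type] || []).push(fn);
  }
  // Terse alias with identical behavior
  on(type, fn) {
    this.addEventListener(type, fn);
  }
  dispatch(type, data) {
    (this.handlers[type] || []).forEach((fn) => fn(data));
  }
}

const d = new Dispatcher();
let clicks = 0;
d.addEventListener("click", () => { clicks++; }); // verbose form
d.on("click", () => { clicks++; });               // terse form, same effect
d.dispatch("click");
console.log(clicks); // 2
```

Both registration calls do exactly the same thing; the only difference is how much of your reading attention the method name consumes.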
My opinion of ActionScript aside, with Apple's exclusion of Flash from mobile devices and Microsoft's subsequent statement that Windows 8 will not support Flash either, it seems likely that Flash will be phased out within ten years. Even Adobe recognizes the need to move on, with its latest product for building web front ends in HTML5.
So, with that writing on the wall, if it were my decision, I would learn HTML5. It is available now, it is in a growth phase, and since it is not proprietary, tech companies have no reason not to adopt it. It will be around much longer than Flash. Sure, if you are going for a quick strike (a one-time product that will only be around a few years) and you already know the language, there's no reason not to use Flash. But long term? It would be more valuable to invest my time in learning HTML5.