Wednesday, June 27, 2012

Tools are the bane of the Beginner

"Huh? Isn't that counter-intuitive?", you ask?
I realizing something particular as a result of some experiences and wrote that line down, but to me even it sounds counter-intuitive; so let me hasten to explain.

A beginner, for the purposes of this discussion, is somebody who's beginning something. This could be a novice starting to learn a skill; but it could also be an expert who's beginning a new project within his area of expertise.

A beginner (thus defined) is hindered by the presence of tools and frameworks for two reasons:
  • They obscure the what and the how of the problem at hand by hiding it inside the tool (usually an issue for novice beginners).
  • They prevent easy exploration of the why of the problem and its solution space through ceremony and restricted access (usually an issue for experienced beginners).
Allow me to present the experiences that led to this realization. It all began one mundane work day in the not-too-distant past....


Story#1

My team had just got a bunch of freshers. They were recent engineering graduates (presumably with some exposure to programming) who had passed through the company's training program (which again presumably imparts further such exposure). We found, however, that they couldn't do some simple tasks, like write code outside Eclipse. They didn't know how to deploy a web application except through the Eclipse plugin, had never debugged an application via logs, and in fact didn't know about web apps as an idea independent of "Tomcat". Their OO concepts were shaky at best, yet they had implemented small-but-complete web applications using Tomcat, Struts and Hibernate and passed an EJB exam. When asked to build that same study app from scratch using the command line, however, they were lost. When asked to build a different application (than the one they'd done) *using Eclipse*, they were similarly lost.

While a large portion of the blame should rightly lie with the teaching methodology (or lack of one), the tools and frameworks too, IMO, should bear some of it. "Deploying" for them meant clicking on the Tomcat icon in Eclipse, so they had no need to know what "web.xml" did, nor did they know that it was no longer required. The same wizards and menu options that make the life of a practitioner easy actually obscure the underlying process (and why it's required) from a beginner.
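To make the "no longer required" bit concrete, here's a minimal sketch (the class name and URL are my own invention): since Servlet 3.0, the mapping that used to live in web.xml can be declared as an annotation on the servlet class itself, and deploying is just packaging a WAR and dropping it into Tomcat's webapps directory - no Eclipse icon involved.

import java.io.IOException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// The @WebServlet annotation replaces the <servlet> and <servlet-mapping>
// entries that web.xml used to carry, so a simple app needs no web.xml at all.
@WebServlet("/hello")
public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("text/plain");
        resp.getWriter().println("Hello from a servlet with no web.xml");
    }
}

None of that is visible when the whole experience is "click the Tomcat icon".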


Story#2

The same group of freshers were slowly getting on track with (re)learning the basics of programming when I thought it might be a good idea to instill in them, at this "early age", the values of Test Driven Development. I immediately checked myself, however, because they'd have to learn JUnit and how to use it. On second thought, though, I realized that they didn't HAVE to use JUnit or any such framework to do TDD. All they had to do was write a test before writing the actual code, have it fail, write the code and have it pass the test. The test could be code in the same function, or in main(), or a series of calls to the program stored as a shell script, or a JUnit test. All of these are equally valid as "tests"; we generally, however, recognize only the last of them as such. The concept of TDD has been usurped by the concepts of the tools that implement it - to the extent that TDD doesn't seem to have a life outside of those tools.
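Here's roughly what I mean by a framework-less test - a throwaway main() that acts as the test, with FizzBuzz standing in for whatever the freshers were actually building (the class and method names are just for illustration):

// A "test" in the loosest sense: a main() that exercises the code
// and fails loudly if the behaviour is wrong. No framework needed.
public class FizzBuzzTest {

    // The code under test -- written *after* the checks below were
    // written and seen to fail.
    static String fizzBuzz(int n) {
        if (n % 15 == 0) return "FizzBuzz";
        if (n % 3 == 0)  return "Fizz";
        if (n % 5 == 0)  return "Buzz";
        return Integer.toString(n);
    }

    // Poor man's assertEquals.
    static void check(String expected, String actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError("expected " + expected + " but got " + actual);
        }
        System.out.println("ok: " + expected);
    }

    public static void main(String[] args) {
        check("1", fizzBuzz(1));
        check("Fizz", fizzBuzz(3));
        check("Buzz", fizzBuzz(5));
        check("FizzBuzz", fizzBuzz(15));
    }
}

Red, green, repeat - with nothing but javac and java.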

Tools, therefore, seem to be actively scuttling the consumption and adoption of concepts, even though they were created explicitly to automate the repeated application of known concepts.

I have personally been struggling with this - the guilt of not doing TDD vs the allure of just seeing working code - especially when I'm beginning something and still feeling out the problem and solution spaces. In some cases the solution space doesn't have readily available tools (BDD on the browser, anyone?), and in others there are tools but I'm still not ready to commit to them because I don't know what my solution is yet (should I build the parser first to figure out the syntax, or the AST interpreter to see how it would run?). My liberation came when I declared that a test will be whatever I call a test for the situation at hand, not what some framework determines to be one. Since then the test-red-code-green-repeat cycle has been a much more doable one.

Full Circle

Back to the initial "Huh?" moment from the beginning. Why then do we generally consider tools to be useful - especially for beginners? Tools are generally time-savers. They do one thing and they do it well; and that is their value. They do, however, have an "operating range" in which they're most useful. Below that, they're overkill and above, they're obstructive - as depicted in this highly accurate graph on problem size vs tool effectiveness:


So when we talk about tools being useful, we're usually talking about that useful operating range. Specifically for beginners, tools are solution accelerators within that range. The stories presented here, however, represent the two ends of the spectrum, where tools are sub-optimal.

Note: I've glossed over frameworks in this discussion, but the concept is the same - or applies even more so to frameworks. Frameworks by definition are a standard solution to a common problem, with room for customization so that application specifics can still be implemented. A framework is one because it has a known world view and exposes an interface that allows operations on that world view. The concept of an operating range is therefore well-ingrained, as are those of the limits on either side. So please read "tools/frameworks" wherever you see "tools" in this post.

So...

Armed with this framework for evaluating tools, we can start asking some interesting questions.
  1. What is a good tool?
  2. When are tools not required?
  3. When are tools required?
  4. How do we determine the operating range of a tool, then? 
  5. What can we do to use tools more effectively?
  6. What can tool builders do to make effective tools?
Attempts at answers to these questions in part 2 of this article.

Sunday, June 10, 2012

What if there IS no source code?

...yeah, kinda linkbait-y heading, I know.

What I mean is: What if there is no single source of the code?

Let me explain.

Typically, we have source code. It's written by someone, stuck in a source control system somewhere, changed by others, and so on.

The projectional editing school of thought modifies that picture somewhat by suggesting that we could have different views of the same code - a functional/domain-specific one, a folded one, a UML(ish) one, a running trace one and so forth. The relationship between the source and the multiple views, however, remains decidedly one-to-many.

What if instead of this master-slave relation, the different views themselves were the source? That is, the "whole picture" is distributed across the views - like a peer network?

Assuming the views are consistent with each other, modifying the source in one view should retain the true intent. But is that even possible? Views are, by definition, projections of the code, meaning some parts are included in the view and some aren't. So how would we maintain consistency across multiple views?

Two paths lead from here:
One: We cannot. This is why we need the "one true source" that's the parent of all views.
Two: Maybe we don't need consistency all the time. Maybe we can make do with the "eventual consistency" that the big data/NoSQL folks are raving about?

#2 seems like an interesting rabbit hole to explore :)
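To make #2 slightly less hand-wavy, here's a toy sketch - purely my own illustration, not how any projectional editor actually works, and all names (View, PeerViewsDemo, the "facts" keys) are made up. The idea: treat each view as a peer that records timestamped edits to shared facts about the code, and merge by keeping the newest write per fact - the last-writer-wins flavour of eventual consistency.

import java.util.HashMap;
import java.util.Map;

// Each "view" holds its own copy of some facts about the code
// (modelled here as key/value properties) plus when it last changed each one.
class View {
    static class Entry {
        final String value;
        final long timestamp;
        Entry(String value, long timestamp) { this.value = value; this.timestamp = timestamp; }
    }

    final Map<String, Entry> facts = new HashMap<>();

    void edit(String key, String value, long timestamp) {
        facts.put(key, new Entry(value, timestamp));
    }

    // Merge keeps the newer write for every key: last-writer-wins.
    void mergeFrom(View other) {
        other.facts.forEach((key, theirs) -> {
            Entry mine = facts.get(key);
            if (mine == null || theirs.timestamp > mine.timestamp) {
                facts.put(key, theirs);
            }
        });
    }
}

public class PeerViewsDemo {
    public static void main(String[] args) {
        View umlView = new View();
        View textView = new View();

        umlView.edit("Customer.name.type", "String", 1);
        textView.edit("Customer.name.type", "Text", 2);   // a later, conflicting edit

        // Peers sync in both directions; after the exchange they agree.
        umlView.mergeFrom(textView);
        textView.mergeFrom(umlView);

        System.out.println(umlView.facts.get("Customer.name.type").value);  // Text
        System.out.println(textView.facts.get("Customer.name.type").value); // Text
    }
}

No single parent source anywhere - just peers that drift and then converge.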

Thursday, June 07, 2012

Naked Objects for Mobile

Why not, I ask?

The Naked Objects pattern is an idea that I've always found interesting and useful for quick-n-dirty apps.

The Apache Isis way of NO seems a little overwrought (as most Apache projects are), but I've been following JMatter for quite a while now and have been impressed, despite it being a Swing-only UI.

For a simple one-user app, the NO pattern seems sufficient: the thing gets built quickly, there's a lowered expectation of customized UI from the user, and you probably even get platform portability.
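The core of the pattern fits in a toy sketch like the one below - my own illustration, not how JMatter or Isis are actually implemented, and the Customer/NakedForm names are made up: the "UI" is derived directly from the domain object by reflection, so there's no hand-written form code at all.

import java.lang.reflect.Field;

// A plain domain object -- the only thing the app author writes.
class Customer {
    String name = "Alice";
    String email = "alice@example.com";
    int loyaltyPoints = 42;
}

public class NakedForm {
    // Render a crude "form" by reflecting over the object's fields,
    // the way a Naked Objects framework derives its UI from the domain model.
    static void render(Object obj) throws IllegalAccessException {
        System.out.println("== " + obj.getClass().getSimpleName() + " ==");
        for (Field f : obj.getClass().getDeclaredFields()) {
            f.setAccessible(true);
            System.out.printf("%-15s [%s]%n", f.getName(), f.get(obj));
        }
    }

    public static void main(String[] args) throws Exception {
        render(new Customer());
    }
}

Swap the println-based rendering for Swing widgets (or, on mobile, native views) and you get the gist of why the pattern travels well to small apps.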