TITLE: Using Null Object to Get Around Truthiness
AUTHOR: Eugene Wallingford
DATE: November 14, 2011 3:43 PM
DESC:
-----
BODY:
Avdi Grimm recently posted a short but thorough introduction to a common problem encountered by Ruby programmers: nullifying an action when the required object is not present. In a mixed-paradigm language, this becomes an issue when we use if-statements to guard the action. Ruby, like many languages, treats a distinguished value as false and all other values as true. Thus, unless an object is "falsey", it will be seen as true by the if-statement. This requires us to do extra tests or wrap our objects to ensure that tests pass and fail at the appropriate times. Grimm explains the problem and walks us through several approaches to solving it in Ruby. I recommend you read it.

Some of my functional programming friends teased me about this article, along the lines of a pithy tweet:
Rubyists have discovered Maybe/Option.
Maybe is a wrapper type in Haskell that allows programmers to indicate that an expected value may not be available; Option is the Scala analog. Wrapper types are the natural way to handle the problem of falsey-ness in functional programming, especially in statically-typed languages. (Clojure, a popular dynamically-typed functional language in the Lisp tradition, doesn't make use of this idea.) When I write Haskell code, as rare as that is, I use Maybe as a natural part of typing my programs. As a functional programmer more generally, I see its great value in writing compact code. The alternative in Scheme is to use if expressions to switch on value, separating null (usually) base cases from inductive cases.

However, when I work as an OO programmer, I don't miss Maybe or even falsey objects more generally. Indeed, I think the best advice in Grimm's article comes in his conclusion:
If we're trying to coerce a homemade object into acting falsey, we may be chasing a vain ideal. With a little thought, it is almost always possible to transform code from typecasing conditionals to duck-typed polymorphic method calls. All we have to do is remember to represent the special cases as objects in their own right, whether that be a Null Object or something else.
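The transformation Grimm describes can be sketched in a few lines of Ruby. The names here (User, NullUser, find_user) are illustrations of my own, not from his article: the guard clause gives way to a polymorphic method call, with the special case represented as an object in its own right.

```ruby
# Before: callers must guard against the missing case.
# Note that a homemade "null" object would not help the guard,
# because every Ruby object except nil and false is truthy.
#
#   if user
#     user.notify(message)
#   end

# After: the special case is an object in its own right.
class User
  def initialize(email)
    @email = email
  end

  def notify(message)
    puts "Mailing #{@email}: #{message}"
  end
end

class NullUser
  def notify(message)
    # Deliberately do nothing -- the null case, made explicit.
  end
end

def find_user(users, id)
  users.fetch(id, NullUser.new)
end

users = { 1 => User.new("avdi@example.com") }

find_user(users, 1).notify("hello")   # mails the user
find_user(users, 2).notify("hello")   # quietly does nothing; no if needed
```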
My pithy tweet-like conclusion is this: If you are checking for truthiness, you're doing OO wrong. To do it right, use the Null Object pattern. Ruby is an OO language, but it gives us great freedom to write code in other styles as well. Many programmers in the Ruby community, mostly those without prior OO experience, miss out on the beauty and compactness one can achieve using standard OO patterns.

I see a lot of articles on the web about the Null Object pattern, but most of them involve extending the class NilClass or its lone instance, nil. That is fine if you are trying to add generic null object behavior to a system, often in service of truthiness and falsey-ness. A better approach in most contexts is to implement an object that behaves like a "nothing" in your application. If you are writing a program that consists of users, create an unassigned user. If you are creating a logging facility and need the ability for a service not to use a log, create a null log. If you are creating an MVC application and need a read-only controller, create a null controller.

In some applications, the idea of the null object disappears into another primitive object. When we implement a binary tree, we create an object that has references to two other tree objects. If all tree nodes behave similarly except that some do not have children, then we can create a NullTree object to serve as the values of the instance variables in the actual leaves. If leaf nodes behave differently from interior nodes, then we can create a Leaf object to serve as the values of the interior nodes' instance variables. Leaf subsumes any need for a null object.

One of the consequences of using the Null Object pattern is the elimination of if statements that switch on the type of object present. Such if statements are a form of ad hoc polymorphism. Programmers using if statements while trying to write OO code should not be surprised that their lives are more difficult than they need be.
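The tree case might look like this in Ruby. The class names are hypothetical, chosen for the sketch: interior nodes hold Leaf objects rather than nils, so traversing the tree needs no nil checks and no type tests.

```ruby
# Interior nodes hold a value and two subtrees. Leaf stands in
# wherever a child is absent, subsuming any need for a null object.
class Node
  def initialize(value, left = Leaf.new, right = Leaf.new)
    @value, @left, @right = value, left, right
  end

  def sum
    @value + @left.sum + @right.sum
  end
end

class Leaf
  def sum
    0   # the base case, expressed as behavior rather than a nil check
  end
end

tree = Node.new(1, Node.new(2), Node.new(3, Node.new(4)))
puts tree.sum   # no if statements anywhere in the traversal
```

Note that the recursion bottoms out by polymorphism: Leaf#sum answers 0, so Node#sum never has to ask what kind of object its children are.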
The problem isn't with OOP; it's with not taking advantage of dynamic polymorphism, one of the essential benefits of OOP. If you would like to learn more about the Null Object pattern, I suggest you read its two canonical write-ups.

If you are struggling with making the jump to OOP, away from typecasing switch statements and explicit loops over data structures, take up the programming challenge of writing increasingly larger programs with no if statements and no for statements. Sometimes, you have to go to extremes before you feel comfortable in the middle.

-----