Shedding Tiers: Stop Thinking in Terms of Layers
Thinking in terms of layers or tiers really isn't much help to you. Stop doing it: The single responsibility principle and design patterns are all you really need.
Now that the Entity Framework (EF) has replaced almost all of what used to be called the data access layer (DAL) -- code that converted rows into objects and converted changes to objects into SQL statements -- it's odd to find designers still trying to define part of their application as a DAL. Those developers seem to feel their application has to have a DAL and they just need to figure out where it is. I don't think those layers exist. In fact, the more current conceptual tools (design patterns and the single responsibility principle, for example) provide far better guidance about how to divide up your code than the layers paradigm ever did.
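The kind of plumbing a hand-written DAL used to contain looks roughly like this sketch (in Python with SQLite rather than .NET, and with a hypothetical Customer entity, purely for illustration): one method turns rows into objects, another turns an object change back into a SQL statement. EF generates the equivalent of both for you.

```python
import sqlite3
from dataclasses import dataclass

# Hypothetical Customer entity -- illustrative only, not from any real schema.
@dataclass
class Customer:
    id: int
    name: str

class CustomerDal:
    """Hand-rolled DAL: rows -> objects, object changes -> SQL."""
    def __init__(self, conn):
        self.conn = conn

    def get_all(self):
        # Convert rows into objects.
        rows = self.conn.execute("SELECT id, name FROM customers")
        return [Customer(id=r[0], name=r[1]) for r in rows]

    def update_name(self, customer):
        # Convert a change to an object into a SQL statement.
        self.conn.execute("UPDATE customers SET name = ? WHERE id = ?",
                          (customer.name, customer.id))
        self.conn.commit()

# Demo against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Ada')")
dal = CustomerDal(conn)
customers = dal.get_all()
```

None of this code expresses business logic; it's pure translation between two representations of the same data, which is why an object-relational mapper can absorb it wholesale.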
History's Dead Hand
Part of the reason layers and tiers exist is historical: In the days of client-server computing, you had to figure out what you were going to install on each physical computer. Tiers combined two ideas: Classes within a tier communicated freely among themselves because they all resided on the same computer; calls to classes in another tier were restricted because those classes resided on a different computer. Tiers then got extended into logical layers and kept hanging around as, presumably, a useful design tool. But in an era of interconnected systems (thank you very much, Web Services and the cloud), that paradigm isn't much help anymore.
These days, when I'm solving a particular problem, I look for a design pattern that will give me a solution with characteristics I value. The issue of "Into what layer do the classes that make up the pattern fall?" simply doesn't come up. If such a fundamental tool as design patterns ignores the issue of layers or tiers, you have to wonder what layers and tiers bring to the party.
There are other approaches to layers. For instance, in his excellent book, "Code Complete: A Practical Handbook of Software Construction, Second Edition" (Microsoft Press, 2004), Steve McConnell recommends "stratification": levels that give you a consistent view of your application and allow you to dip into other, deeper levels only when needed. The example he uses to demonstrate stratification is, effectively, an implementation of the Façade pattern. If you have the Façade pattern, why do you need stratification?
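To make that comparison concrete, here's a minimal Façade sketch (in Python, with hypothetical subsystem names -- this is not McConnell's example): callers get one consistent, high-level surface, yet the deeper classes remain directly reachable when a caller genuinely needs to dip into them, which is exactly what stratification promises.

```python
# Hypothetical subsystems -- names are illustrative only.
class Inventory:
    def reserve(self, sku):
        return f"reserved {sku}"

class Billing:
    def charge(self, amount):
        return f"charged {amount}"

class Shipping:
    def schedule(self, sku):
        return f"shipped {sku}"

class OrderFacade:
    """Facade: one consistent view over several deeper subsystems."""
    def __init__(self):
        self._inventory = Inventory()
        self._billing = Billing()
        self._shipping = Shipping()

    def place_order(self, sku, amount):
        # Callers see one simple operation; the subsystem classes
        # stay available for direct use when that's really needed.
        return [self._inventory.reserve(sku),
                self._billing.charge(amount),
                self._shipping.schedule(sku)]

steps = OrderFacade().place_order("A-100", 25)
```

The pattern gives you the "consistent view with deeper levels available" property without ever asking which layer `OrderFacade` or `Billing` belongs to.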
How We Do Things Now
That's not to say you don't need some categorization scheme. One of the results of applying the single responsibility principle is that you end up with a lot of classes. You need some way to organize your classes so that you can think about them and talk about them with other developers. However, that probably means you need several different categorization schemes, which you apply at different times for different purposes.
No layers, no tears.
Peter Vogel is a principal in PH&V Information Services, specializing in Web development with expertise in SOA, client-side development, and user interface design. Peter tweets about his VSM columns with the hashtag #vogelarticles. His most recent book ("rtfm*") is on writing effective user manuals, and his blog posts on communicating effectively can be found at http://blog.learningtree.com/category/communication-2/.