Having used a whole range of drawing libraries over the years, beginning with Digital Research's GEM VDI, through various versions of the Windows GDI, to Java Graphics and later Java2D (Graphics2D), I can attest that Cocoa's drawing is modern and powerful. Yet at the same time it has a number of surprises (which perhaps derive, as the Cocoa Drawing Guide puts it, from its legacy in NeXTSTEP).
I have no experience at all (yet) of directly driving what purports to be the more modern and powerful drawing library on OS X: Quartz 2D.
On first encountering Cocoa drawing (the usual experiments to orientate oneself), one of the things that immediately struck me was the way in which graphical objects express themselves directly into the drawing context. For instance, instead of something like:
[myGraphicsContext setColor:[NSColor blackColor]];
one writes instead:
[[NSColor blackColor] set];
Similarly, true graphical objects like images and paths can simply be told directly to draw themselves into the current graphics context. In Cocoa the current context lives sort of 'in the ether': you are not passed a context in your drawing method (nor do you ordinarily obtain or create one), and you can issue drawing commands at any time. To target a specific drawable context (such as an image), you simply send a 'lockFocus' message to the object that contains the drawing target. From that point, all drawing done with any graphical object goes to that context until 'unlockFocus' is sent to the same target.
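As a minimal sketch of this focus-based style (the image size and colours here are my own illustrative choices, not anything from a real project), drawing a line into an offscreen image might look like:

```objc
// Render a red diagonal line into an offscreen image. Once lockFocus is
// called, any drawing code that runs targets this image's context - note
// that no context parameter appears anywhere below.
NSImage *canvas = [[NSImage alloc] initWithSize:NSMakeSize(64.0, 64.0)];

[canvas lockFocus];                        // all drawing now goes to 'canvas'
[[NSColor redColor] set];                  // the colour sets itself into the context
NSBezierPath *path = [NSBezierPath bezierPath];
[path moveToPoint:NSMakePoint(0.0, 0.0)];
[path lineToPoint:NSMakePoint(64.0, 64.0)];
[path stroke];                             // the path draws itself
[canvas unlockFocus];                      // restore the previous focus
```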
This is quite an interesting inversion to my previous experiences. Indeed, so was the fact that classes such as NSString have an intrinsic ability to draw themselves.
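To illustrate (a sketch only - the text and coordinates are arbitrary), a string can be told to render itself at a point in whatever context currently has focus:

```objc
// Inside drawRect: (or between lockFocus/unlockFocus), an NSString
// instance draws itself directly - no separate text-renderer object.
NSDictionary *attrs = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSFont systemFontOfSize:12.0], NSFontAttributeName,
    [NSColor blackColor], NSForegroundColorAttributeName, nil];
[@"Hello, Cocoa" drawAtPoint:NSMakePoint(10.0, 10.0) withAttributes:attrs];
```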
Another oddity is that while Cocoa has a set of objects that know how to draw themselves (paths, strings, images etc.), some of the more convenient shapes (rectangles, ellipses etc.) are not available directly. Instead, for convenient drawing of rectangles one uses a family of NS* functions (e.g. NSRectFill). Drawing rectangles is arguably one of the most common drawing operations, and it's interesting that they are treated so differently (presumably because they are simple, and historically NeXTSTEP maintained them as plain structs rather than as true objects).
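The contrast is easy to see side by side (again just a sketch, with made-up coordinates):

```objc
// Rectangles are plain NSRect structs drawn via free functions,
// rather than objects that draw themselves.
[[NSColor blueColor] set];
NSRectFill(NSMakeRect(10.0, 10.0, 100.0, 50.0));   // filled with the current colour
NSFrameRect(NSMakeRect(10.0, 10.0, 100.0, 50.0));  // outlined

// The object-style alternative, if you want a true path after all:
[[NSBezierPath bezierPathWithRect:NSMakeRect(10.0, 10.0, 100.0, 50.0)] fill];
```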
Recently, I had a need to create a hatch pattern: a sort of barber-pole that I could draw over other graphics as a visual cue. As a Cocoa n00b, I thought "OK, I bet there's a nice pattern brush I can set that will be used during fill operations". Hitting the Cocoa Drawing Guide though, I was surprised to find that things aren't so simple. Well... they are and they're not.
It turns out that OS X drawing doesn't support brushes as such. Reading the various porting guides (for Windows developers and QuickDraw developers) pointed to a feature of Quartz 2D where you can create handlers for drawing 'patterns' programmatically: you register a callback, and your routine is called to fill in the pixels of a 'cell' - one tile of the pattern being drawn. However, further investigation revealed that this feature is not exposed through Cocoa. Instead, Cocoa has the (to me) curious option of creating a patterned NSColor from an image. I suppose it's not that weird when you consider that pattern brushes are just as 'magical' in their effect - a sort of 'magic ink' concept. Anyway, this colour can then be set into the graphics context as usual with the -set method, and further drawing operations performed. The downside of this approach, at least in my case, is that you have to start with an image: either you happen to have one lying around with the right pattern, colours and alpha, or you go to the trouble of making one programmatically and caching it somewhere. As I only wanted a very simple monochromatic seamless texture, the ability to define an appropriate area and be called back to render a single unit/pixel would have been quite convenient. In the end, I opted to launch Photoshop rather than write the code to create an image programmatically (though you could argue that I'd have had to write essentially the same code if Quartz 2D's programmatic patterns had been available directly in Cocoa).
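In code, the pattern-colour approach boils down to something like this (a sketch: 'hatchImage' stands in for whatever tileable NSImage you have managed to obtain):

```objc
// Wrap a tileable image in a pattern colour, set it into the current
// context, and fill - the pattern tiles across the filled area.
NSColor *hatch = [NSColor colorWithPatternImage:hatchImage];
[hatch set];
[NSBezierPath fillRect:NSMakeRect(0.0, 0.0, 200.0, 200.0)];
```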
Having created my pattern colour, I was caught out by another n00b error (wrong assumption). I attempted to draw my hatch pattern (complete with fully transparent pixels) over my existing drawing using NSRectFill. This seemed to work great until I moved the window around and realised that the transparent bits of the pattern were composited in such a way as to make my window transparent in those sections! I wondered whether the "pattern as a colour" thing was fully supported for arbitrary drawing with alpha, and I tried setting various compositing modes into the graphics context to see if that would make a difference - to no avail. Only later did I revisit the documentation on NSRectFill (which I thought I was familiar with!) to discover that it always uses NSCompositeCopy when drawing. Ah ha! By this time I had written code to render a rectangle into an image and then draw that in turn into my view, but I was most satisfied to rip all that out in favour of a simple substitute rectangle draw call: NSRectFillUsingOperation.
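For the record, the fix is a one-liner (sketch with an arbitrary rectangle):

```objc
// NSRectFill always uses NSCompositeCopy, which copies the pattern's
// transparent pixels straight into the backing store - punching holes in
// the window. NSRectFillUsingOperation lets you composite 'over' instead,
// preserving whatever is already drawn underneath.
NSRectFillUsingOperation(NSMakeRect(0.0, 0.0, 200.0, 200.0),
                         NSCompositeSourceOver);
```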
To wind up this somewhat meandering post, I'm sure that I have many more 'ah ha' moments to experience as I continue to get to grips with Cocoa drawing. However, despite the various asymmetries I perceive in its design, I am beginning to develop a _real_ familiarity with, and trust in, how it all works. I'm finding that I need to keep the Cocoa docs pretty close to hand though - probably a little more than I needed to when learning Java2D.