Month: August 2012

Fall 2012 trends!

The September issue of InStyle landed in my mailbox a couple of weeks ago, and it's a honking door-stop of a magazine. It weighs almost as much as my toddler-going-on-teenager eating machine. Summer is a bit of a snooze-fest, fashion-wise, so I am always excited about the first fall issues to come out; they're guaranteed to be chock-a-block with new trends to try on, pass on, or snark on. Shall we get to the good stuff?

So, according to InStyle, here are the fall 2012 trends coming to a mall near you … soon-ish.

Hunter green

The name makes me think of that ugly minivan colour that was ubiquitous in the 90s.

Functional, maybe. Wearable … not so much.

Not a colour I have any desire to wear, nor an easy one to accessorize. However, most of the photos in InStyle show a much more wearable colour, which I would describe as more of a muted jade than a hunter green. Generally, unless the season’s “it” colour is one you like anyway, I don’t recommend jumping on this kind of trend. Chances are, the colour will not be a good fit with the rest of your wardrobe and its overall colour scheme. Also, it’s very likely that you will be able to count on one hand (or one digit) the number of acquaintances who know what the “it” colour is in any given season, so this trend will probably not give you the big “ooh-you’re-so-stylish” bang for your buck.

Ankle-strap heels

I love the shoes featured in InStyle under this heading, but this style can be tricky. Too thick a strap, especially on a shorter person, can be unflattering. It's always better to look for a thin strap, or a "nude"-coloured strap, and pair it with a slightly higher hemline. Definitely nothing below the knee, unless you have mile-long legs.

I wouldn’t consider ankle-strap shoes a wardrobe staple (here are a few ideas in that direction), so I won’t be looking too hard for a pair; if a nice, affordable one crosses my path, I won’t say no, though.

Military details

I’ve tried the whole military thing before, and it wasn’t my cup of tea. Some trends are just too far removed from my personal style to make me feel “authentic”, and this is one of them. I always feel like I’m either trying too hard, or looking like a high school marching band reject. Or both.

If you have an edgier style, and are a big fan of khaki and/or camo, this look may be right up your alley.

Trumpet hems

Now, this is one fall 2012 trend I can wholly embrace. I love pencil skirts, and a trumpet hem just ups the fun factor. They are almost universally flattering, provided the length stays at or above the knee. As an additional bonus, this is a “trend” that never really goes out of style so (a) you will likely have no trouble finding it in most stores; and (b) you can wear it year after year without fear of looking dated after six months.

Anthropologie Dotty Trumpet skirt ($149)

Dome bags

Trendy bags are a concept whose purpose and utility escape me. Unless you are the kind of person who buys inexpensive, disposable bags with some frequency, a bag tends to be one of the bigger "investments" you are likely to make in your wardrobe. Accordingly, it's probably a good idea to buy a bag that's classic and useful. In my books, dome bags are neither. The style is a perversion of the bowling bag, rendered even more inconvenient by squashing the sides together.

Brocade

Brocade is a really fun fabric – so lush, and decadent, and luxurious, y’know? It’s also the perfect fabric for a trendy item, because it looks expensive even when it isn’t. And it makes quite a statement …

Marie Antoinette, style icon

… well, not necessarily this much of a statement. Although I'm definitely down with "stylish lounging", if that were a legitimate sport or something. How does one go about petitioning the Olympic Committee?

Shift dress

I like shift dresses. They tend to be forgiving, and work great with leggings – perfect for the cooler fall weather. I'm not sure this is much of a trend, though, because shifts are a pretty timeless style. They're pretty versatile, too; you can wear them "as is" for a more casual vibe, or add a belt for a more sophisticated, body-skimming look. And keep the length relatively short; because this is a more voluminous style, going too long risks potato-sack territory. The leggings come in handy to keep things PG-13.

Voluminous coats

OK, shift dresses are one thing … big, sack-like coats are another. I have to draw the line somewhere. I'm not sure this trend is flattering on anyone, unless we're talking about seven-foot-tall, willowy Amazons. And even then, I have my reservations. A coat, like a bag, is one of those items where the cost dictates that you go classic. Ideally, something both flattering and functional. Voluminous coats might be functional – after all, there's enough room in there for a half dozen sweaters to keep you warm in the deepest of deep freezes – but that's about it.

Patent leather

Here, I feel like InStyle is reaching. I mean, does patent leather ever go out of style? I don't think so. A pair of black patent leather pumps is, after all, a shoe collection staple. If you don't already have one in yours, consider this your chance to rectify the oversight. I promise, you won't regret it. Which is more than you can say about most trends.

What are your favourite Fall 2012 trends?

Friday Flashback: Google brain

This week’s post, entitled “Do the Evolution, Baby”, was originally written in June 2008:

I come from a tradition of Western culture in which the ideal (my ideal) was the complex, dense and “cathedral-like” structure of the highly educated and articulate personality — a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West.
But today, I see within us all (myself included) the replacement of complex inner density with a new kind of self-evolving under the pressure of information overload and the technology of the “instantly available”. A new self that needs to contain less and less of an inner repertory of dense cultural inheritance — as we all become “pancake people” — spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.
Will this produce a new kind of enlightenment or “super-consciousness”? Sometimes I am seduced by those proclaiming so — and sometimes I shrink back in horror at a world that seems to have lost the thick and multi-textured density of deeply evolved personality.
– Richard Foreman

In a recent Atlantic Monthly article, Nicholas Carr posed the following provocative question: is Google making us stupid? [Ed. note: the article appeared in the July/August 2008 issue.] To roughly paraphrase the gist of Carr’s article, the ever-increasing use of and reliance upon the Internet as one’s primary source of information has the capacity to change the way we think. Carr elegantly puts it thus:
As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski. 
I understand Carr's scepticism (ambivalence? regret?). It is akin to the feeling I have when I consider the possibility that books might one day be obsolete, and realize that the deep sentimental attachment that feeds my regret is an anachronism in this day and age. I suppose this is a natural phenomenon of living in an era of radical technological progress. The way we live, think, and interact with one another can change so quickly nowadays that we can easily fall out of step if we stop, even for a moment, to reflect on what might be lost through our evolution. I'm not sure that humanity has ever before faced such a fast-paced barrage of change – or embraced it quite as unquestioningly.

A century ago, people could reasonably expect to encounter no more than one significant technological advancement in their lifetime, and to deal with its social and cultural consequences. Now, the pace of technology is such that people can speak meaningfully about "generation gaps" – generations being sometimes less than a decade apart. I don't think most of us have fully grasped the uniqueness of our present position. And, by many accounts, we are for now merely on the cusp of the actual technological 'boom'.

For these reasons, I can't speak unequivocally about my own feelings or expectations in the face of the coming change. I have no precedent by which to be guided, and the future – whatever it may be – is guaranteed to be so radically different from the here-and-now that it defies scrutiny. All I can say is that I wish people were a little more conscious of the subtle changes being wrought in every aspect of their daily lives, and perhaps a little more circumspect in each step they take towards the brave new world.

We create technology, and technology changes us, influencing everything including the way and what we create. Where is the beginning and where is the end, and where does the balance of control rest in this equation? Trying to unravel this particular conundrum is a little like trying to answer that eternal question – which came first, the chicken or the egg? Are we still, at this point, functioning within some natural framework – of evolution, God’s design, what have you – or are we now beyond any such ‘simplistic’ conceptions? If progress is good, have we thought about what it is exactly that we are progressing to? Who is, and should be, the “we” in question? Who is ultimately making the decision, and what’s being done to ensure that it’s an informed one? Of course, I’m over-simplifying things to the point of absurdity for the sake of my argument. It’s silly to think that humanity might strike up something like an ad hoc committee to decide its own fate. Even if it were logically – and logistically – possible, it would probably immediately degenerate into farce; its only fruit most likely nothing more than a useless report, 5 years and a billion dollars too late.

Yet, it strikes me that the goal or destination of our march of progress is an important question. Shouldn’t we have some idea, some general sense of direction in such matters? Shouldn’t we at least give some thought to the plausible outcomes, and arrive at something more solid than a mere assumption of their suitability? Merely plunging ahead, which seems to be our current modus operandi, seems fraught with possibilities for regret later on. While it isn’t exactly fashionable to question the Google that ‘feeds’ us, perhaps it might be wise.

August 10, 2012: I originally wrote this post a year or two after I signed up for a Facebook account. I had been one of the first people in my circle of acquaintances to do so, and I remember vividly how novel an experience it seemed then, and even a year later. Four years after that post, I am hard-pressed to remember a version of my social life that didn't revolve around Facebook. I can count on one hand the number of friends with whom I don't interact through Facebook – and with many, it is the primary or even exclusive channel. In fact, it would not be hyperbole to state that the majority of my social interactions, outside of immediate family and work, occur through or are facilitated by Facebook. I also now have a Twitter account, a LinkedIn profile, and have been pressed on more than one occasion to sign up for Pinterest.

And, even so, I remain one of the "old school" bunch. I prefer to read my books in paper form, and I don't have an apps-loaded iPhone to navigate everything from shopping to public transit. Slowly but surely, the social hub of the entire plugged-in world is moving online. It might be taking old-timers like me a little longer to fully assimilate, but to my son's generation, it looks like this will be the only known mode of existence. Perhaps that is why I always feel a little bit of joy whenever I see my son turn his attention away from his determined pursuit of the laptop to a book. Of course, he can't read yet. But I hope that the physical experience stays with him, acting as a counterweight to all the intangible technology that surrounds him.

One of the questions that jumped out at me on re-reading the post was this: "…what it is exactly that we are progressing to?" Having read bits and pieces on the philosophy of transhumanism, that question continues to fascinate and frighten me; most of us have given so little thought to our species' future (separate and apart from that of the planet), even though it is likely to be such an unrecognizable one. Will humanity cease to exist at some point – not because of some global catastrophe, but because we will cease to be human? A blurring of the line between human and artificial intelligence has long captivated popular imagination, while at the same time proving repellent to our sense of (human) identity. But is that identity going to slowly become as much of an anachronism as a geographically-fixed sense of community?

Mind your mind

I was recently talking to an acquaintance and had occasion to compliment her on what she was wearing, which I did by telling her that her look was very Breakfast at Tiffany's-esque. A second later, I panicked at the thought of potentially having put my foot in my mouth (which happens on a regular basis); I quickly added that my intention was to compare her to Audrey Hepburn, not a prostitute. She looked at me like I was from Mars, but politely thanked me anyway.

Foot, meet mouth.

As I said, this sort of thing is a regular occurrence for me. My unfortunate tendency to blurt things out is matched only by an irrepressible tendency to overthink them. A compulsion to store seemingly limitless amounts of useless information means that my brain has plenty of fodder at its disposal when analyzing the possible implications of a casual remark.

[For anyone who remains puzzled by my Breakfast at Tiffany's comment, the original (literary) version of Audrey Hepburn's character was, to put it gently, a high-class escort and unrepentant gold-digger. In the book, she does end up running off to Brazil, leaving the George Peppard character in the dust. Speaking of which, gratuitous photo time:
Fans self. OK, let’s move on.]

Pop culture dissertation aside, thinking about my shortcomings in this area reminded me of a guest post I read recently on one of the personal finance blogs I follow regularly. [This one.] The author proposed the following question as a means of recognizing what forms the substance of one’s sense of identity (which might be composed of any number of smaller or bigger elements):

What mind do you bring to a problem?

Let's say you identify yourself in terms of being a teacher, a runner, a parent, a volunteer firefighter. When it's time to tackle a problem, which of those traits or characteristics most informs your decision-making process?

Take me, for example. There are many ways in which I would normally describe myself, but only one that fully captures the essence of my personality, as judged by the "kind" of mind I bring to my everyday life. And, no, it isn't the mind of a flaky trivia junkie. Generally speaking, whenever I am confronted with a decision (in its broadest terms), I typically have an instinctive reaction – a sense, if you will, of what constitutes the "right" answer. That instinctive reaction is almost always immediately countered by the reflex to defer action until I've had a chance to carefully analyze all available data. Essentially, the decision-making process is one by which I exhaustively check all relevant points to make sure that my first instinct is right. I am open to the possibility that it is not, but I'm not necessarily going to be swayed by a negative that isn't written in stone. I just need to know whether my gut instinct can be reasonably justified. All in all, I would call this a lawyer's mind.

Lawyering is often misunderstood. In the public eye, lawyering is often equated with being a glorified mouthpiece, a pit-bull-for-hire, a … well, you can insert your own stereotypes here. With the possible exception of the folks who run ads during commercial breaks in the Maury show, this equation doesn't hold up. From personal observation, it seems to me that good lawyers are successful because they've mastered the skill of properly assessing a client's problem and, to the extent made possible by the law, taking the strongest position that serves to advance the client's interests – both those interests of which the client himself is aware, and those he may not yet have considered. A gut instinct, informed by experience and close familiarity with the subject matter, is essential in knowing whether a case likely has enough merit to justify further action and, if so, what particular action is best suited to the situation. At the same time, that gut instinct is rarely if ever acted upon without additional research and analysis. It never pays to be rash … but neither does it pay to be timid or equivocal. Being able to consider and objectively evaluate other people's perspectives or reactions to a situation is also important, because it allows you to anticipate your opposition. It always pays to be prepared.

I was actually surprised when I realized that this was, in fact, an apt description of my approach to decision-making. I've always considered myself an odd dichotomy of impulsive and cautious, but it strikes me now that these are simply two facets of a process that is more than the sum of its parts. And it explains so much. I hate having to make decisions on the fly; on the other hand, once I make a decision, I'm usually able to rationalize it enough to be happy with it no matter what. I also hate pronouncing opinions on matters that are not cut-and-dried, because I dread the possibility that I might miss an important consideration only to have someone come along and point out the resulting flaw in my argument. Getting it "wrong" – whether due to a failure of logic, foresight or knowledge – is practically a phobia of mine. I hate generalizations; toss a sweeping statement of so-called fact my way, and my inner devil's advocate bristles like a riled bull before a red flag. With that said, you're probably never going to meet an audience more inclined to be attentive to your side of the story, as long as it's at least somewhat articulate and logical. Note that I said "attentive", and not receptive. If I don't ultimately agree with you, I will probably (politely) tell you why, point by point. I love a stimulating dialogue about as much as I hate ad hominem attacks – the last refuge of a person defeated by logic.

Giving some thought to the mind I bring to my decision-making has helped to make me more aware of both its strengths and limitations. For example, it’s made me realize that I need to be less hesitant about voicing an off-the-cuff opinion that I may later have to revise, if only because it can lead to an exchange of ideas that can enrich the discussion of the issue and help me better understand and articulate my own thoughts. It’s also made me understand that the decisions I’m less comfortable with always involve considerations of facts or theories that I have no means of knowing, understanding or verifying; in order to avoid paralysis of choice in those cases, I am going to have to up my “risk tolerance” and act more impulsively – within reason, of course.

Your turn: what mind do you bring to a problem?