on circling back and extending logic
- Risako
- Sep 20
- 7 min read
Three weeks into grad school, I decided that “Put differently…” and “By this logic…” in academia correspond to “circling back” and “putting a pin in it” in Corporate America. I wouldn’t necessarily put them in the same bucket; they’re just vaguely overused and an easy target for jokes. But they are also functional: sometimes you do need to circle back or piggyback onto someone else’s logic, filler phrases can buffer interpersonal dynamics, and equivocating can buy you precious time.
My parents always told me “it’s easy to criticize,” and unfortunately for me, I have to give them another point on the “turns out you were right” scoreboard. But their advice didn’t really come with any particular alternatives, so I’ve had to figure that part out on my own. Because my god, there is just so much to criticize right now.
Closing the Loop
There seems to be a bit more self-awareness from those in corporate spaces: tweets using the jargon out of place make us feel like we’re in on some joke, while short-form media skits offer an easy, mindless laugh. From what I’ve experienced (admittedly little), there seems to be less of this in academia, mostly a passivity about being socialized into conventions of The Academy. (A sociologist recently told me something along the lines of: “sociologists are just like poodle owners, subscribing to and embodying the status-driven behavior and stratification that they are supposedly studying.”)
It’s probably some coping mechanism, aimed at distancing ourselves from capitalism or status-driven institutions. And if I had a nickel for every time I myself made these jokes to get a laugh or perform self-deprecation, oof.
But I’ve started to find these critiques a little… lazy. It’s unclear exactly what we’re critiquing when we laugh at people “putting a pin in things.” And it’s one thing to highlight the inaccessibility of industry-specific jargon, but I don’t know if sanitizing it is the answer either. I certainly don’t think it’s wrong to be self-aware and/or critical, and I know that these jokes aren’t pretending to be any more than what they are. Nonetheless, it’s made me pause and think about how I engage with these takes. At best, I got a laugh, and at worst, I tried to compensate for existing as part of these spaces, preemptively putting down… something.
This stands in contrast to the fact that I have actively–particularly when entering these spaces for the first time–incorporated these phrases into my vernacular. I wanted to partake in “pinging” people and it felt kind of fun to be part of a “quick sync to align our priorities.” I think that’s the whole point of language; I remember adopting slang to signal not just my linguistic but my cultural and social fluency at the local elementary school in Japan over the summer.
In Art Worlds (1982), Howard Becker, one of the most influential contemporary sociologists, defines conventions as the tacit norms (practices, understandings, behaviors) that members comply with in order to simplify coordination. More specifically:
Members of art worlds coordinate the activities by which work is produced by referring to a body of conventional understandings embodied in common practice and in frequently used artifacts. The same people often cooperate repeatedly, even routinely, in similar ways to produce similar works, so that we can think of an art world as an established network of cooperative links among participants. If the same people do not actually act together in every instance, their replacements are also familiar with and proficient in the use of those conventions, so that cooperation can proceed without difficulty.
The mere existence of conventions turns everything–from individual techniques and skill sets to aesthetic styles, processes, and practices–into a social interaction, even without the presence of other actors. It certainly did for me when I first joined these spaces, and honestly even as I shifted to what felt like some progression in my awareness. Engaging in such conversations doesn’t actually undermine a system so much as further socialize you into it. As they say, the opposite of love is indifference, not hate.
By That Logic
In an effort to be less cynical or self-deprecating about why academics haven’t really engaged in a similar flavor of jokes, an alternative interpretation could be that traditional phrases have enough linguistic or rhetorical purpose that expansive usage doesn’t undermine their value. We can see this with the one-word transitions: “Additionally” and “However” are part of the everyday vernacular, and while a pretentious asshole could mis- or overuse “Correspondingly…” “Conversely…” “Notwithstanding…,” it doesn’t necessarily interfere with my usage. If anything, it serves as an important reminder for me to be more intentional, making sure I’m not granting a phrase some specious sense of power “just because it sounds right.”
Furthermore, their longer cousins like “One might reconceptualize this as…” or “The implication, then, is that…” might sound annoying but still carry rhetorical power. Once again, they could easily be abused by students trying to hit some word count, but that doesn’t particularly impact whether I use them or not. Rather, the issue emerges when the rhetorical and the logical begin to come at the cost of one another. “Extending that logic…” allows you to hand-hold your reader through your argument, but redundancy can lead you down a needlessly circuitous route that risks letting your reader get distracted or even stray off the path.
As a reader, I’ve found myself confusing rhetorical signposts with logic. Even if it’s not intentionally misleading, sometimes A doesn’t actually equal B or X logic can’t actually be extended to Y. Identifying these weak links or gaps is important not just to assess an argument, but also because it offers hints for where you can expand, strengthen, and of course, undermine, thereby letting you join in on the conversation.
These realizations have been tremendously helpful for my own thinking and writing. Learning new words and phrases not only equips you with more options, it expands the scope of your thinking. Whether as a reader or writer, deliberating between two phrasings or wordings prompts further engagement. It’s yet another dimension of the “writing to think” process that motivated this blog in the first place. (Case in point: “deliberating” felt like it would offer a more authentic sense of going back and forth that “debating” didn’t feel capable of evoking in that particular sentence.) This could so easily tip from useful to insidious, obscuring rhetorical gaps not just for the reader but for the writer. But rhetoric and logic aren’t a zero-sum game, and each can bolster the other if used carefully.
So then… Luddites: Disciplined or Lazy?
It’s been a slow process to actually put these reflections into practice, especially when there’s just so much actually wrong with the world. But I think I saw the first flickers of this coming to fruition in a recent conversation with a friend, where we were talking about what it means to engage with AI critically.
Most critique ChatGPT but still use it; others wholeheartedly acknowledge their reliance on it; and then there are those who seem to truly abstain from it, in solidarity with, say, the climate. And while they’re not wrong, the choice feels a little lazy. Not the part about its environmental impact, because on that they are absolutely right. But climate change affects everything, and quite literally every single move and choice we make only exacerbates what’s going on. Maybe lazy isn’t the right word and I’m stretching to make a connection to #corporatetok, but I think pitting AI and the planet against one another might allow us to feel like we’re achieving more than we actually are.
Rather, I think it’s important to make sure that we’re interrogating AI directly and on its own terms. Because the AI/climate duality just makes it one of many cases of human greed at the cost of our collective good or planet: a bit more diabolical than the others, perhaps, but frankly not at the top of the list. The same holds for the claim that kids are getting more stupid. ChatGPT is not the only adversary; we’ve been losing both our critical thinking skills and our attention spans since the TV made its way into the average household, only to be followed by computers, social media, and smartphones.
Instead, I’ve been trying to focus more on how AI is impacting us at an infrastructural level. As someone who 1) quickly adopted “circling back,” 2) equally quickly joined the campaign against it, 3) still “extends this logic,” 4) often lets these logical signposts guide my reading instead of thinking through it, and 5) still mocks those using it despite it all, I know my thinking remains full of lazy critiques and distractions. But as we seem to be losing our right to free speech before our own eyes, I have to at least be able to recognize when there’s a facade of resistance or logical thinking in order to actually engage in it.
There’s been something very specific about turning 26, a milestone I was prepared to be bothered by but one that ended up being underwhelming. Still, I feel like I've unlocked a… new way of engaging with my past self. Until this point, I assumed that my present self had more knowledge and life experience than any past version: 21-year-old me had a lot of insights and advice to pass on to my 15-year-old self, running the gamut from productive to stupidly reactionary. But in your mid-20s, evolving and growing has to be a conscious choice, and a bumpy road at best. If my 26-year-old self can no longer assume progress over my 23-year-old self by virtue of time alone, then changes (including those made to maintain things in my life) are a reflection of a series of conscious choices. And I'm getting a little better at seeing even the things I regret as necessary steps.
This reflection feels a little divorced from the rhetorics of “self-love,” “being kind to yourself,” and “caring for your inner child,” and more like a matter of fact. Because whatever I unlocked doesn’t feel all that good. It’s exhausting to choose to grow. Fuck being an agentic actor. Also, being able to converse with your past self on an equal footing may be a slippery slope toward getting stuck in time. The best description I can whip up right now, 26 years and almost a month into life, is that it’s a funny feeling.
Getting older–and not even “getting old” or “aging” yet–still feels scary to me, and I’ve been reluctant to let go of the fact that I still feel 23 or 24. And I feel ill-equipped to enter my 30s despite being excited about it. What’s a girl to do (0:12)?