Fractured; or, A Sum
A version of me exists online. I smile out from my Instagram/ Twitter/ Facebook profile picture, eyes greeting the camera. Curly hair sizzles and statics as it meets the background, blurring into the vintagewallpaper.jpeg I’ve tried to use to block out my bedroom, but slivers of fish posters eke in through the space between curls. My teeth are white, skin clear and bright. Locked into my room, hidden from covid-positive roommates, I think I look professional.
A version of me exists online. A friend posts a picture of me in pajamas, hair unkempt and eyes bleary. I smile from the sidelines of prom photos, looking towards the wrong camera. I peek out from the background of my cousin’s graduation party, a crescent face staring out. I’m not tagged in any of them, all posted prior to the creation of my own social media; this version of myself is detached, distanced from my account. In these pictures, I lack control but am protected through anonymity, known only to those who know my face.
A version of me exists online. I do not exist before the age of four when the local newspaper spots me at the park eating ice cream in February. My jacket is too bulky, hands barely reaching mouth. My first dog, a husky, stands next to me, eyeing the photographer hesitantly. Months later, after the dog dies, the picture will be cut from newsprint and pasted to the wall of the ice cream shop.
A version of me exists online. When I’m younger, appearances are less frequent. My mother, trained on true crime, refused to have my likeness reproduced in mediums that she couldn’t control. Despite this, the occasional photograph in the local paper was deemed safe enough. When I’m ten, my Girl Scout troop is lectured by the police department, reminded that anything we post online is forever. We’re solemn when they finish, but they take our pictures anyway. The newsprint warps our features, digitized scans showing children with coal-stained faces. When I’m thirteen, I build a skiff at the Shipbuilding Museum down the road. In celebration of a semester’s work, my four classmates and I row them in the harbor. Tentatively, I push off from the shore. I am awkward, white undershirt dragged over orange shorts, but in the pictures taken before the current drags me out, before having to fight for hours to return to land, I’m safe within the confines of the photograph. Head cocked back to face the photographer on the bridge, I smile.
A version of me exists online. My grandfather dies in 2013, my grandmother in 2017. I’m housed after “survived by,” followed by a laundry list of genealogy. Later, when my grandmother’s sister falls into ancestry.com, she adds me to her tree. I am a daughter and a granddaughter, a niece and a cousin. When I die, I will become memorialized in numbers.
A version of me exists online. On TruthFinder, I’m recorded as living in Massachusetts and Virginia. I’m twenty-three, born in July of 2000. They can see that I’m still on my parents’ phone plan, that I picked stukonducttape@yahoo.com to be my first email username when I was twelve. The site warns me that I might find illicit accounts under this name, that I might have a history as an escort or be on a government watchlist. For $47.03, they’ll reveal my secrets.
A version of me exists online. On Tinder, I try to attract people holding fish for an essay I’m working on. I pick selfies where I’m wearing lipstick, where my body is adorned in dresses. My blurbs highlight a personal love for fish, namedropping arapaima and sturgeon as favorites, but I restrain myself, trying to keep things brief. I center my image around someone who listens, who will engage in text conversations without taking up an excess of space. I’m upfront with my research-based motivations and am surprised when, without the immediate opportunity of romance, I’m not seen as important enough to engage with. Despite changing myself to fit the aesthetics of the discourse, I’m unable to fully commit to the platform’s true sentiments.
A version of me exists online. Virtual storefronts remember the shape of my body, memorializing it in metadata. A Modcloth size 14, a Levi’s large. When scrolling through crochet patterns on Pinterest, ads boast their memory—yes, they were listening when I was debating between flare and bootcut jeans. The models for both styles are smiling, are at ease, but the woman showcasing the flares looks more comfortable, more confident. After hours of thoughtless consideration, I buy the pants.
A version of me exists online. After every doctor’s appointment, a visit summary is uploaded into my patient portal. I am my height and weight, my temperature and blood pressure. I become a sum of my symptoms and prescriptions, an identity constructed through diagnosis. Over time, my pain increases, and a more detailed image is constructed. Scans of my brain and cervical spine show the matter inside the self, the target of symptom, yet image doesn’t show the sharp prick of pain behind left eye, the ache in my jaw. I’m told the photographs are textbook depictions of a healthy body. Pain is only able to exist on a plane of reported testimony, in symptoms witnessed by medical professionals, in untrusted personal accounts. Pain becomes easy to hide, becomes mitigated, but my charts leak the secrets.
A version of me exists online. The small museum where I worked throughout the pandemic tries to adapt to a changing digital landscape. I become the medium, leading a camera through the historic house. I smile when describing the feat of the structure, force my face to display sympathy when describing the abduction of cat corpses. Yes, there are some things that are truly upsetting—the 2nd-century child sarcophagus that a patron coated in green crayon, the shattered 18th-century claviharp keys—but this Faith doesn’t display frustration. She swallows her bitterness. Smiling, she reminds viewers that the museum is accepting donations to further historic preservation.
A version of me exists online. When I become particularly worried about generative AI, I make an account with ChatGPT to test the interface myself. Hesitant, I borrow a burner number from England off a Google search, log in with the same email I use when online shopping. Yet, when prompted to input a name, I keep my own. My imaginations of the algorithm are much more personified, more customized. I envision something similar to Siri, a friendly assistant that greets you by name, that responds well to polite language. Despite working with the intent of breaking the mechanism, exposing how it poses a risk, I still feel the need to maintain a certain level of honesty if addressed by name. This version is interested in abortions and homophobia, in understandings of the Civil War and the production of napalm. She keyed her ex-boyfriend’s car because he slept with her best friend and is convinced that a demon has possessed her Kermit the Frog figurine. She’s impulsive and obsessive and manipulative. Despite her honesty, she is not greeted as herself.
A version of me exists online. Following my published pieces are a photoshopped headshot and a blurb: Faith Palermo is a [queer] writer from [Eastern] Massachusetts. She is an MFA candidate in creative nonfiction at George Mason University and the Nonfiction Editor at Phoebe. Her work has appeared in the Hyacinth Review. [You can find her on Twitter @faith_palermo.] Not captured is the scheduled panic that occurs whenever I paste my bio into Submittable. While I know that I will be given the opportunity to update it if my work is accepted, I want the image of myself that I present to have a strong first impression. Prior to submitting anything, I go to my friends’ websites and obsess over their bios, even though, through repetition, I have most of them memorized. This anxiety feels misplaced; as a nonfiction writer, I should be worried about the image of myself presented through essay, but I keep finding myself stuck behind the standardization of the bio—while the essay is a place of experimentation and honesty, the bio is a place of networking and professionalism.
A version of me exists online. My Google Maps history records the places I’ve navigated to. This Faith is someone who misses a lot of turns; I know this comes from stubbornness, from knowing geography better than an algorithm, but this version of myself is seen as someone who is incapable of following instructions, as someone who is directionally challenged. A persona steeped in recalculating. This Faith is resistant to change. Based on where I spend the most time, it decides a location that must be my home. When I move to Virginia for grad school, I fight this shifted distinction; I want to be reminded of the distance between me and Massachusetts whenever I open the app, want to be reminded that my body belongs to a different place.
A version of me exists online. The students that I teach submit AI generated creative writing. In the classroom, I had anticipated this, had smiled from the front of the room when highlighting the transferable skills that writing fosters. Grading behind the screen, I become indignant; the Faith that emails them reflects this. I score them low, giving them the opportunity to contest the grade. Most do, sending messages steeped in confusion, in misunderstanding. This faceless version of myself is easier to gaslight, easier to manipulate. I run submissions through AI detectors, input similar prompting into ChatGPT itself. I download Chrome extensions to dig into metadata, tearing through copied-and-pasted information from the AI. When I evidence this in my reply, I tell myself that my tone is smooth, calculated. I try to keep myself from feeling too smug.
A version of me exists online. My search history is littered with targeted queries. I become obsessed with uncovering information about others virtually; when my friends begin dating, I try to find the things that potential partners keep secret. I find arrest records and speeding tickets, divorce settlements and hateful comments posted under pseudonyms. When a friend finds someone she’d like to interview for a project she’s working on, she sends me her name. I find an active email address within ten minutes. There’s something about breaking imagined privacy that intrigues me, about wanting to see the different versions of people that they don’t want me to see.
A version of me almost exists online. One summer, I return to the museum before realizing we’ve outgrown each other. For five years, I acted in accordance with small nonprofit politics, following the motions of authority and image. Knowing when to smile and nod and when an extra second of hesitation would be enough to disagree made me trustworthy, gave me control. My understanding of this, combined with the fact that I was doing the work of four people, had convinced me that I was essential, but in the year that I’d been at grad school, everything had been able to continue just fine without me.
The progress that impacted me the most was the continuation of scholarship. There’s a sense of power derived through knowledge; for a brief moment, I was one of a handful of historians who knew the most about one of the most prolific American inventors of the 20th century. At 22, this was something that I claimed as a major point of pride. I told myself that I’d write critical essays on the provenance of artifacts, on the literary relations within the queer community in 1950s Massachusetts. Due to the small size of this group, I assumed that I could contribute to the archive at my leisure, but by the time I returned, they’d already been written by the people who filled my place.
That summer, a coworker asks if I’d like to be memorialized. He explains that they’re looking to record someone in a series of poses, capture their voice reading different phonetic parts. The raw data would be uploaded to Adobe’s cloud, creating a playable character. Manipulated by the museum, she would escort guests throughout the building, educating them on artifacts of their choosing. Regardless of the museum’s changes, she would remain informed, would be regularly updated to reflect the most up-to-date scholarship. Silent until spoken to, she would accompany the viewer on a tour that never ends.
I imagine a version of myself that knows no rest, a version that will continue to work until the program breaks. This version would be institutionalized, a mouthpiece for sponsorship appeals and grants. Outside of the hours needed to create her, I would not be paid for indefinite labor, yet my work would live on past my lifespan, continuing to deliver the script that I’ve recited four times a day for the past five years.
Almost immediately, I laugh. By now, I know how insecure data is; there isn’t much preventing anyone from adapting the character for their own purposes. This version of myself could be tasked with recording sexual harassment training videos for corporations or with acting in an artsy short film that can’t afford to pay human actors. Maybe someone proficient in deepfakes might return to their roots, merging this version of myself with another human, using body as a canvas for pornography.
Still, something about this nags at me. I wonder how a different version of myself would respond; would someone who hasn’t researched the inherent risk of artificial intelligence react the same way? Would a version operating exclusively on training data that does not trust online cataloguing deny the offer, or would the need to be helpful, to be intrinsically connected to an institution, outweigh the risk? Is this potential version closer to other versions than it is to me? Does it even matter?
Versions of me exist online. Versions of me exist in the present, in the future, created by previous iterations, informed by personal lineage. All of these versions are true. The distinction of the self is imagined, imperfect. Identity is a collage: colors and shapes and textures converge, and are seen more clearly through their alignment.