
I Built A Bot To Apply To Thousands Of Jobs At Once–Here’s What I Learned

As this job seeker’s “faith in the front-facing application process eroded into near oblivion,” a lower-tech strategy took its place.

[Photo: Flickr user Kai Schreiber]

I have a great job, and there’s no rush to leave. As a director at a national nonprofit I’ve built some fantastic teams, but over the past year they’ve gotten so good at what they do that I’ve begun to wonder whether I’m still needed. So I started slowly casting about for new challenges, initially by applying (perhaps naively) to openings at well-known tech companies like Google, Slack, Facebook, and Squarespace.

Two things quickly became clear to me:

  1. I’m up against leaders in their field, so my resume doesn’t always jump to the top of the pile.
  2. Robots read every application.

The robots are “applicant tracking systems” (ATS), commonly used tools for sorting job applications. They automatically filter out candidates based on keywords, skills, former employers, years of experience, schools attended, and the like.

As soon as I realized I was going up against robots, I decided to turn the tables–and built my own.

HOW I BUILT A JOB-APPLICATION MACHINE

I’m no engineer, but I play with technology a lot. I’ve been known to find ways to automate things (social media, data processing, web content, etc.) out of boredom or creativity or both. So I cobbled together a Rube Goldbergian contraption of crawlers, spreadsheets, and scripts to automate my job-application process, modestly referring to it as my “robot.”

My robot aggregated hiring managers’ contact information, then submitted customized emails with my resume and a personalized cover letter. Soon, I was imagining myself telling the story of how I’d turned my job search into a super-precise job firehose.

I tracked how many times my cover letter, resume, or LinkedIn profile was viewed. I also tracked email responses (including from autoresponders). It wasn’t a particularly elegant mechanism, but it was ruthlessly efficient. The first time I fired it up I accidentally applied to about 1,300 jobs in the Midwest during the time it took me to get a cup of coffee across the street. I live in New York City and had no plans to relocate, so I quickly shut it down until I could release a new version.

After several iterations and a few embarrassing hiccups, I settled on version 5.0, which applied to 538 jobs over about a three-month period.

NOT EVEN ROBOTS READ RESUMES

To cut to the chase, it didn’t work. I’m still looking for the right gig–and it may not shock you to hear that my robotized approach hasn’t paid off.

But before you remind me that I did exactly what every career coach and recruiter tells jobseekers not to do, hear me out: I wasn’t just blasting the same content to every imaginable job listing–far from it. I tested different email subject lines, versions of my resume, and cover letters. I built my robot in order to adjust and optimize as many variables as possible when applying to each new job, just like an individual might, one application at a time.
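
As a rough illustration of that kind of per-application variant assignment, here is a minimal, hypothetical Python sketch; none of the subject lines, file names, or fields below come from the article, they are placeholders for the idea of randomizing variables and logging which combination went out.

```python
import csv
import random

# Hypothetical variant pools; purely illustrative, not the article's actual content.
SUBJECT_LINES = ["Director-level candidate for {role}", "Re: your {role} opening"]
RESUME_VERSIONS = ["resume_nonprofit.pdf", "resume_tech.pdf"]
COVER_LETTERS = ["letter_standard.txt", "letter_robot_disclosure.txt"]

def assign_variant(job):
    """Pick one combination of subject line, resume, and cover letter for a job."""
    return {
        "job_id": job["id"],
        "subject": random.choice(SUBJECT_LINES).format(role=job["title"]),
        "resume": random.choice(RESUME_VERSIONS),
        "cover_letter": random.choice(COVER_LETTERS),
    }

def log_variant(variant, path="applications.csv"):
    """Record which combination was sent, so later responses can be tied back to it."""
    with open(path, "a", newline="") as f:
        csv.DictWriter(f, fieldnames=variant.keys()).writerow(variant)

if __name__ == "__main__":
    job = {"id": 1, "title": "Director of Operations"}
    variant = assign_variant(job)
    log_variant(variant)
    print(variant)
```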

But while I saw some variation in response, there wasn’t much. It seemed like nothing made a difference in actual human reads. One A/B test contrasted a normal-looking cover letter with a letter that admitted, right in the second sentence, that the email had been sent by a robot:

I thought this A/B test of my cover letter would yield significantly different responses. It didn’t.

Now, one of those letters should have performed either a lot better or a lot worse than the other. For my purposes, I didn’t care which; I just wanted to stand out from all the other applicants. But it didn’t seem to matter because, as far as I could tell from this experiment and others like it, nobody reads cover letters–not even other robots like ATS algorithms.


When they were opened, my robot-generated letters performed a little better, but just barely.

By targeting internet companies in particular, I’d chosen an industry with a high likelihood of reliance on resume-processing algorithms. And without the tech pedigree (and corresponding keywords) to sneak by those filters, I had a steep hill to climb, robot or no robot.

Friends were quick to point out the obvious reason that this approach wasn’t working. Most told me I had to know someone who would pass my resume along to a hiring manager. By trying to game that system, I inadvertently learned how powerful it really is. One 2014 study found that 30%–50% of hires in the U.S. come from referrals, and referred candidates are over four times more likely to be hired than non-referrals. According to one hiring consultant’s estimate, referrals lead to a whopping 85% of critical jobs being filled.

Referrals are the minority of applicants but are five times more likely to be hired. (Data: Brown et al.)

IT’S NOT JUST A “NUMBERS GAME”–IT’S MANY NUMBERS GAMES

But even if companies give preference to employee referrals, they must be on the lookout for candidates with unique qualifications–or so I thought.

I asked Scott Uhrig at Agile.Careers, a coaching program for high-tech executives, how a nontraditional candidate would fare in a fiercely competitive job market. He explained that it’s easy to find candidates that fit cleanly within a mold. Beyond that, though, “recruiters are usually not very helpful; they are looking for candidates in the center of the bullseye.” From his vantage point, recruiters don’t have time to search for something outside the norm.

Amy Segelin, president of the executive communications recruiting firm Chaloner, put it a different way: “Out-of-the-box hires rarely happen through LinkedIn applications. They happen when someone influential meets a really interesting person and says, ‘Let’s create a position for you.’”

So that was two strikes against the time-honored tradition of submitting a resume and crossing your fingers. But I wasn’t just handing over a resume. I was handing over a lot of resumes. The law of large numbers suggests that something should get through the ATS and stand out, even next to candidates whose buddies bumped their resumes up to the top of the pile.

Uhrig explained that there was another numbers game at play, too. “Roughly 80% of jobs are never posted–probably closer to 90% for more senior jobs,” he told me. “The competition for posted jobs is insane. ATSes do a horrendous job of selecting the best candidates, and–perhaps most important–the best jobs are almost never posted.”

Other recruiters I’ve spoken to since running my robo-experiments suggested that most positions on job boards were either posted by an HR person who has since changed jobs or have already been filled. (Or, in the case of a lot of tech companies, they’ve already decided to hire someone on an H-1B visa but need to post the position to fulfill requirements.)

In short, it doesn’t matter if you submit two, three, or 10 times as many applications as the average candidate–they’re rarely going to work out in your favor, due to factors beyond your (or your robot’s) control.

LESS APPLYING, MORE NETWORKING

So where has this left me, aside from somewhat disheartened? Well, for one thing, it leaves me a little bit wiser. As my faith in the front-facing application process eroded into near oblivion, I learned three lessons by robotically applying to thousands of jobs:

  1. It’s not how you apply, it’s who you know. And if you don’t know someone, don’t bother.
  2. Companies are trying to fill a position with minimal risk, not discover someone who breaks the mold.
  3. The number of jobs you apply to has no correlation to whether you’ll be considered, and you won’t be considered for jobs you don’t get the chance to apply to.

Maybe I didn’t need an elaborate bot-driven scheme to find that out. And maybe, somewhere along the way, I became more interested in what the data says than in whether or not a robot could actually find me a job. But the project wasn’t entirely without success. Forty-three companies ultimately reached out for follow-up interviews, and I actually talked to about 20 of them. In virtually every case, though, the companies were on the smaller side (fewer than 50 staff), and not a single one had an ATS in place to filter resumes.

I’ve been transparent with almost all of the interviewers about my process, and while I worried it might be a real turnoff, they’ve all responded positively so far; I’ve even landed a few consulting gigs from it. But in the meantime, I’ve given up on applying for jobs the old-fashioned way–both manually and robotically. I’m now scaling back my nonprofit role to three days a week and taking some time to meet interesting people in person and see what I can learn from there. Eventually, I’m hoping, one of those interesting people is going to ask for my resume so they can put it on top of a pile somewhere.

Source: I Built A Bot To Apply To Thousands Of Jobs At Once–Here’s What I Learned | Fast Company

The Batman Equation

https://www.google.com/search?hl=en&output=search&sclient=psy-ab&q=2+sqrt(-abs(abs(x)-1)*abs(3-abs(x))%2F((abs(x)-1)*(3-abs(x))))(1%2Babs(abs(x)-3)%2F(abs(x)-3))sqrt(1-(x%2F7)%5E2)%2B(5%2B0.97(abs(x-.5)%2Babs(x%2B.5))-3(abs(x-.75)%2Babs(x%2B.75)))(1%2Babs(1-abs(x))%2F(1-abs(x)))%2C-3sqrt(1-(x%2F7)%5E2)sqrt(abs(abs(x)-4)%2F(abs(x)-4))%2Cabs(x%2F2)-0.0913722(x%5E2)-3%2Bsqrt(1-(abs(abs(x)-2)-1)%5E2)%2C(2.71052%2B(1.5-.5abs(x))-1.35526sqrt(4-(abs(x)-1)%5E2))sqrt(abs(abs(x)-1)%2F(abs(x)-1))%2B0.9&pbx=1&oq=2+sqrt(-abs(abs(x)-1)*abs(3-abs(x))%2F((abs(x)-1)*(3-abs(x))))(1%2Babs(abs(x)-3)%2F(abs(x)-3))sqrt(1-(x%2F7)%5E2)%2B(5%2B0.97(abs(x-.5)%2Babs(x%2B.5))-3(abs(x-.75)%2Babs(x%2B.75)))(1%2Babs(1-abs(x))%2F(1-abs(x)))%2C-3sqrt(1-(x%2F7)%5E2)sqrt(abs(abs(x)-4)%2F(abs(x)-4))%2Cabs(x%2F2)-0.0913722(x%5E2)-3%2Bsqrt(1-(abs(abs(x)-2)-1)%5E2)%2C(2.71052%2B(1.5-.5abs(x))-1.35526sqrt(4-(abs(x)-1)%5E2))sqrt(abs(abs(x)-1)%2F(abs(x)-1))%2B0.9&&cad=h
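
Decoding the query string in that URL, the search plots four piecewise expressions. Transcribed into math notation (constants copied verbatim from the URL, so treat this as a best-effort reading rather than a verified formula):

```latex
\begin{align*}
y_1 &= 2\sqrt{\frac{-\lvert\lvert x\rvert-1\rvert\cdot\lvert 3-\lvert x\rvert\rvert}{(\lvert x\rvert-1)(3-\lvert x\rvert)}}
       \left(1+\frac{\lvert\lvert x\rvert-3\rvert}{\lvert x\rvert-3}\right)\sqrt{1-\left(\tfrac{x}{7}\right)^2} \\
    &\quad{}+\Bigl(5+0.97\bigl(\lvert x-0.5\rvert+\lvert x+0.5\rvert\bigr)
       -3\bigl(\lvert x-0.75\rvert+\lvert x+0.75\rvert\bigr)\Bigr)
       \left(1+\frac{\lvert 1-\lvert x\rvert\rvert}{1-\lvert x\rvert}\right),\\
y_2 &= -3\sqrt{1-\left(\tfrac{x}{7}\right)^2}\,\sqrt{\frac{\lvert\lvert x\rvert-4\rvert}{\lvert x\rvert-4}},\\
y_3 &= \left\lvert\tfrac{x}{2}\right\rvert-0.0913722\,x^2-3+\sqrt{1-\bigl(\lvert\lvert x\rvert-2\rvert-1\bigr)^2},\\
y_4 &= \Bigl(2.71052+\bigl(1.5-0.5\lvert x\rvert\bigr)-1.35526\sqrt{4-(\lvert x\rvert-1)^2}\Bigr)
       \sqrt{\frac{\lvert\lvert x\rvert-1\rvert}{\lvert x\rvert-1}}+0.9.
\end{align*}
```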

Dragon Curve – Numberphile – YouTube

Recreating VSCO Filters in Darkroom

Note: If you want to eat the fish without learning how to fish, feel free to jump ahead to the “How to Recreate a VSCO Filter” section below. But where’s the fun in that?


Introduction

Anyone passionate about photography is familiar with the feeling: You go on a trip, you take heaps of photos every day, then at some point you go through them, either piecemeal or all at once, trying to identify which ones you want to edit and what you want them to look like. Then you get to work.

What I quickly realized when I was going through that routine two years ago was how repetitive the process was. I knew I wanted to use VSCO’s M5 filter. It was by far my favorite because of what it did to yellows, greens, and blues, but it had some quirks I didn’t care for: it was too warm, and it crushed my highlights.

Putting aside the work involved in identifying which photos I wanted to edit in the first place and how much work it took to import them, anyone familiar with VSCO understands the pain of editing multiple photos. I knew there was no technical reason why such a constraint and inefficiency was necessary. Further, I knew that VSCO’s filters (and all the other filter apps, for that matter) simply operated on the premise of LUTs (look-up tables). Before we continue, I think an understanding of what a LUT is and how filters work will really help demystify them.

An Explanation of Look-Up Tables

Without getting into too many details (look up the terms for a deeper understanding, no pun intended), the basic premise behind LUTs is that a simple image is generated covering every possible color in the RGB color space. For those unfamiliar with the RGB color space, a quick two-sentence explanation goes something like this: color spaces can be represented as three-dimensional shapes that contain all colors. RGB is represented as a cube, with each side ranging in value from 0 to 1.

The RGB color space. Image via https://en.wikipedia.org/wiki/RGB_color_space

What a look-up table does is slice that RGB cube into thin slices and arrange them into a flat image. The flat image is your table, in which you “look up” colors based on their location in the image (how a color relates to a location in the image depends on how you sliced the cube and how you arranged the slices).

An unedited 3-D LUT. Image via https://github.com/BradLarson/GPUImage/blob/master/framework/Resources/lookup.png
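
To make the slicing concrete, here is a small Python/NumPy sketch that generates an unedited identity LUT. It assumes the common GPUImage-style layout linked above (a 512×512 image made of an 8×8 grid of 64×64 tiles); other apps may slice and arrange the cube differently.

```python
import numpy as np
from PIL import Image

def identity_lut(tiles=8, tile_size=64):
    """Generate an unedited (identity) LUT image: an 8x8 grid of 64x64 tiles.

    Within each tile, red increases left-to-right and green top-to-bottom;
    blue is constant per tile and increases tile by tile. This matches the
    common GPUImage-style layout, which is an assumption, not VSCO's format.
    """
    size = tiles * tile_size
    img = np.zeros((size, size, 3), dtype=np.uint8)
    for b in range(tiles * tiles):               # one blue level per tile
        ty, tx = divmod(b, tiles)                # tile row/column in the grid
        r = np.linspace(0, 255, tile_size, dtype=np.uint8)
        g = np.linspace(0, 255, tile_size, dtype=np.uint8)
        tile = np.zeros((tile_size, tile_size, 3), dtype=np.uint8)
        tile[..., 0] = r[np.newaxis, :]          # red varies across columns
        tile[..., 1] = g[:, np.newaxis]          # green varies down rows
        tile[..., 2] = int(b * 255 / (tiles * tiles - 1))
        img[ty*tile_size:(ty+1)*tile_size, tx*tile_size:(tx+1)*tile_size] = tile
    return Image.fromarray(img)

if __name__ == "__main__":
    identity_lut().save("identity_lut.png")
```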

How Filters Were Made (Before Darkroom)

So, now that you know what a color space is, and you know what a look-up table looks like, we can get back to filters.

I’m not sure what they’re technically called, but for the sake of this article, let’s call them filter artists.

What a filter artist traditionally does is open a sample photo and manually edit it to achieve a look they’re happy with. They might have a bank of images containing multiple colors and multiple tones so they can tune the edits to each subject matter, but they’re using Photoshop tools to manipulate the image.

Once they’re happy with how their edits are affecting their bank of images, they save those edits, and they apply them to the unedited LUT, which looks a lot like the sample I showed you.

That edited LUT is suddenly valuable. It encodes all the edits of the filter artist, and it defines the filter. Remember how the RGB cube represents all the colors possible in RGB? Well, because the LUT is generated from the cube, the LUT also contains every color. Now, this is where it gets tricky.

The brilliance of the LUT image above isn’t just that it contains every single color; it’s that it contains two sets of information, and this is crucial to understand. The first set is obvious: the colors stored in the pixels of the image. The second set is how the location of each pixel in the image relates to a color.

Here’s an example: the top-left pixel in the LUT I shared earlier is at location (0,0). That pixel is black. We know for a fact that whatever pixel is at location (0,0) was black in the unmodified LUT. If that LUT is passed through Photoshop and the shadows are brightened, then that black pixel is no longer perfectly black; it’s a little gray. There are your two pieces of information: location (0,0) should be black, and the color actually stored there can now be different. That before/after is your mapping of the filter.

When an app like VSCO ships, it ships with a set of LUTs that have been passed through the series of editing steps that make up the individual filters. When you apply one of those filters to one of your photos, the app goes through your photo pixel by pixel and looks up the color of each pixel. If that color is black, it knows to look at location (0,0) in the LUT, because that’s where the original black color should be, and it reads what color is at that location. The app then replaces the original color of the image with the one in the LUT. Ta-da! You just applied the M5 filter to your photo.
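
Here’s a rough Python sketch of that per-pixel lookup, assuming the same 8×8-tile identity layout as above and skipping the interpolation between neighboring entries that a real app would do.

```python
import numpy as np
from PIL import Image

def apply_lut(photo_path, lut_path, tiles=8, tile_size=64):
    """Replace each photo pixel with the color stored at the *location* in the
    edited LUT where that pixel's original color sits in an identity LUT."""
    photo = np.asarray(Image.open(photo_path).convert("RGB"), dtype=np.float32)
    lut = np.asarray(Image.open(lut_path).convert("RGB"))

    levels = tiles * tiles                                     # 64 quantized blue levels
    r = (photo[..., 0] / 255 * (tile_size - 1)).astype(int)    # column inside a tile
    g = (photo[..., 1] / 255 * (tile_size - 1)).astype(int)    # row inside a tile
    b = (photo[..., 2] / 255 * (levels - 1)).astype(int)       # which tile

    tile_y, tile_x = b // tiles, b % tiles
    y = tile_y * tile_size + g
    x = tile_x * tile_size + r
    out = lut[y, x]                                            # the "look up" step
    return Image.fromarray(out.astype(np.uint8))

if __name__ == "__main__":
    apply_lut("photo.jpg", "m5_lut.png").save("photo_filtered.jpg")
```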

The Limitations of the LUT Approach

Congratulations, you now know how to build a photo editing application. The simplicity of this approach is why there are so many photo editing applications on iOS, and why some of them have so many filters. Each app can be slightly different from the others by adding a feature here or there, or by hiring really good filter artists, but they’re all fundamentally the same:

Import > Open > Apply Filter > Save

From an app developer’s perspective, this is great: modify a LUT, send the image down to people’s devices, and they have a new filter! Charge them a few dollars for it, pop some champagne. From a photographer’s perspective, however, this is less than ideal. What happens if, like my example with M5 in the intro, the filter doesn’t match my expectations or style? You’re out of luck. You and twenty million other people are all sharing the same LUTs, and all your photos look the same. If you want to use the app’s auxiliary tools to fix the shortcomings of the filter, you’ve just introduced tons of repetitive work into your editing process.

Obviously, I’m the creator of Darkroom, a photo editing app. This is where I zoom out from the technical details of how filters work, and explain to you why Darkroom is different.

The Darkroom Difference: No Look-Up Tables

The big innovation with Darkroom was to take the same tools that the filter artists use to generate the LUTs, and to port them to your phone. In Darkroom, filters are instructions for generating the color mapping on the fly.

Here’s where this comes into play: Suppose I apply Darkroom’s A100 filter to my photo, but the rich green tones in the shadows aren’t working for me. In Darkroom, I can apply the filter, then use the Curves tool to alter the filter itself. If I see myself doing the same thing repeatedly, I can save those new instructions as my very own filter. Because Darkroom skips the import flow of all the other apps, my editing flow is thus reduced to:

Open > Apply Filter > Save

Except the filter I’m applying is my own, containing all my custom edits.
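
As a contrast with baked LUTs, here is a toy Python sketch of the idea of a filter as an ordered list of instructions applied on the fly. The operations and math are my own simplifications for illustration, not Darkroom’s actual filter format.

```python
from dataclasses import dataclass, field

@dataclass
class Filter:
    """A filter as a recipe: named adjustment steps with parameters, applied in order."""
    name: str
    steps: list = field(default_factory=list)  # e.g. [("exposure", 0.05), ("saturation", 0.15)]

    def apply(self, rgb):
        """Apply each instruction to a single (r, g, b) pixel with channels in 0..1."""
        r, g, b = rgb
        for op, amount in self.steps:
            if op == "exposure":                  # simple additive brightness shift
                r, g, b = (c + amount for c in (r, g, b))
            elif op == "saturation":              # push channels away from (or toward) their mean
                mean = (r + g + b) / 3
                r, g, b = (mean + (c - mean) * (1 + amount) for c in (r, g, b))
        return tuple(min(1.0, max(0.0, c)) for c in (r, g, b))

# Start from a base recipe, tweak it, and save the result as your own filter.
base = Filter("A100-like", [("exposure", 0.05), ("saturation", 0.15)])
mine = Filter("My A100", base.steps + [("saturation", -0.10)])
print(mine.apply((0.2, 0.6, 0.3)))
```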

Now that the fundamental concepts of how LUTs, color spaces, traditional filters, and Darkroom filters all work are out of the way, let’s get back to the original point.


How to Recreate a VSCO Filter

Technically speaking, you can pass a blank LUT through any filter on any app and end up with the same LUT the app uses internally. That doesn’t really mean anything, and it isn’t really very useful. To recreate a VSCO filter in Darkroom, we’re going to need to approximate the instructions used to generate the LUT in the first place. Put another way, we have a meal, and we’re trying to guess the recipe.

To do this, we’re going to need to be familiar with the Curves tool and the Color tool in Darkroom. The former modifies the tones; the latter modifies the colors. Since these are the primary tools used for creating filters in Photoshop and Lightroom, we will generate a specific color palette which we will use to isolate changes to those two tools.

In Darkroom, the Curves tool is divided into five regions (Blacks, Shadows, Midtones, Highlights, and Whites). That’s why the first row in the palette is divided as it is. The patches are desaturated because we want to isolate the impact of tonal adjustments from color adjustments.

The Color tool, however, is divided into eight color channels, represented here in the second row.
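
If you’d like to generate a similar palette yourself, here’s a small Python sketch. The article doesn’t list the exact gray levels or hues it uses, so the values below (five evenly spaced grays, eight evenly spaced fully saturated hues) are guesses.

```python
import colorsys
import numpy as np
from PIL import Image

def calibration_palette(patch=100):
    """Two rows of patches: five grays for the Curves regions, eight hues for the Color tool."""
    # Row 1: desaturated tone patches (blacks, shadows, midtones, highlights, whites).
    grays = [0.0, 0.25, 0.5, 0.75, 1.0]
    # Row 2: eight evenly spaced hues (roughly red, orange, yellow-green, green,
    # cyan, blue, purple, magenta); guessed, not VSCO's or Darkroom's exact channels.
    hues = [colorsys.hsv_to_rgb(h / 8, 1.0, 1.0) for h in range(8)]

    width = max(len(grays), len(hues)) * patch
    img = np.ones((2 * patch, width, 3), dtype=np.uint8) * 255
    for i, v in enumerate(grays):
        img[0:patch, i*patch:(i+1)*patch] = int(v * 255)
    for i, (r, g, b) in enumerate(hues):
        img[patch:2*patch, i*patch:(i+1)*patch] = [int(r*255), int(g*255), int(b*255)]
    return Image.fromarray(img)

if __name__ == "__main__":
    calibration_palette().save("palette.png")
```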

Step 1: Match Tones

Simply download that palette to your phone, import it to VSCO, and pass it through your favorite filter. In this case, we’ll be recreating F2.

Next, move the edited image back to your computer using AirDrop. We’re going to need to read the values of those pixels. There are lots of tools available for doing just that; my favorite is xScope.

Open the edited image and read the values of the gray boxes in order. Here’s what it looks like to use xScope on the Black square, and how the values appear:

You can see on the left the values R: 0.13, G: 0.14, B: 0.17. Those numbers reflect the impact of the filter on black colors, absent any color-specific adjustments. Switch to Darkroom, go to the Curves tool, and apply those numbers to the Red, Green, and Blue curves respectively:

You can see how by the time I added the Blue curve adjustment, the black square already matched the VSCO-edited palette.

Now, repeat the process for the four other tone regions:

Et voilà! We’ve already gotten pretty far!
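
If you’d rather not eyeball each square in xScope, a short script can read the averages for you. This sketch assumes the patch layout from the palette sketch above (100-pixel squares across the top row); adjust the coordinates if your palette or export size differs.

```python
import numpy as np
from PIL import Image

def read_tone_patches(path, patch=100, count=5):
    """Average RGB (0..1) of each gray patch in the top row of the exported palette."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    values = []
    for i in range(count):
        # Sample the middle of each patch to avoid edges softened by JPEG compression.
        box = img[patch//4 : 3*patch//4, i*patch + patch//4 : i*patch + 3*patch//4]
        values.append(tuple(round(float(c), 2) for c in box.reshape(-1, 3).mean(axis=0)))
    return values

if __name__ == "__main__":
    for region, rgb in zip(["Blacks", "Shadows", "Midtones", "Highlights", "Whites"],
                           read_tone_patches("vsco_f2_palette.jpg")):
        print(region, rgb)  # e.g. Blacks (0.13, 0.14, 0.17), per the F2 example above
```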

This is the end of the robotic part of the process. Now we’re on to the subjective, more intuitive part.

Step 2: Match Colors

After matching the Red, Green, and Blue curves in Darkroom, export the palette onto your computer. Name the two images (VSCO F2 and Darkroom F2) so you can differentiate them. Open both in a photo viewing app (I’m using Sketch.app) and compare:

You can see the top bar matches closely, but the colors at the bottom are off. The colors in the VSCO filter appear both less saturated and darker across the board. This is where we start experimenting. Since we have no way of knowing what the recipe is, we have to keep guessing until we get close enough.

It’s important to remember when you’re doing this that your goal isn’t to recreate the VSCO filter exactly. You want to emulate the VSCO filter’s character; it doesn’t have to match. Looking at the palette, the difference is huge, and that’s fine, because the palette is just a tool. Even without adjusting for the colors, the curves get us most of the way to the final character of the F2 filter:

Can you tell which one is VSCO’s F2 filter and which one is the work-in-progress Darkroom emulation? Darkroom is on the right.

There are some differences, notably the saturation of the shadow that the side mirror is casting on the door, but it’s the same in character.

With a little more work, though, we can get much closer. Since saturation appears to be low across the board, let’s knock it down in Darkroom’s Basic Adjustments tool (the default one with all the sliders). We don’t know how much to adjust it, so it could take a couple of back-and-forths of AirDrop’ing the updated palette and comparing again.
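
To take some guesswork out of those back-and-forths, you can also compare the two exported palettes numerically. Another rough sketch, again assuming both exports share the patch layout generated earlier; the channel names are placeholders, since Darkroom’s actual eight channels may be labeled differently.

```python
import numpy as np
from PIL import Image

# Placeholder labels: five tone patches, then eight color patches.
PATCHES = ["Blacks", "Shadows", "Midtones", "Highlights", "Whites",
           "Red", "Orange", "Yellow", "Green", "Aqua", "Blue", "Purple", "Pink"]

def patch_means(path, patch=100):
    """Average RGB of every patch: five tone patches in row 1, eight color patches in row 2."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
    means = []
    for row, count in ((0, 5), (1, 8)):
        for i in range(count):
            box = img[row*patch:(row+1)*patch, i*patch:(i+1)*patch]
            means.append(box.reshape(-1, 3).mean(axis=0))
    return np.array(means)

def compare(vsco_path, darkroom_path):
    """Print per-patch RGB deltas (VSCO minus Darkroom) to show what still needs adjusting."""
    diff = patch_means(vsco_path) - patch_means(darkroom_path)
    for name, d in zip(PATCHES, diff):
        print(f"{name:<10} dR {d[0]:+6.1f}  dG {d[1]:+6.1f}  dB {d[2]:+6.1f}")

if __name__ == "__main__":
    compare("VSCO_F2.jpg", "Darkroom_F2.jpg")
```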

Now, we can see that the Red, Purple, and Pink colors are fairly close, but the Yellow and Green colors are noticeably darker in VSCO, and the blue appears even more desaturated.

To fix the color-specific channels, go to the Color tool and adjust the Saturation and Luminance of the channels with differences. Again, it might take a few iterations to get it right, but take your time; you’ll see progress quickly!

Here are the changes I ended up making to match:

And here’s how they stand next to the original VSCO Palette:

Quite close! Not bad for a few minutes of work. But, there’s one crucial step left!

Step 3: Match the Character

When I was making the edits, I was uncomfortable with how hard I had to push the luminance on the yellow and green channels to match the VSCO palette. I wanted my first test to be with a green-heavy photo:

VSCO left, Darkroom right

Just as I had suspected, the greens and yellows are far too dark.

The thing to remember is that Darkroom is a tool. Toolmakers, by their very nature, have to make decisions while building their tools that shape how those tools behave. For some things, standard mathematical definitions exist to define a tool, and those tools behave identically everywhere. Most of the time, however, the tools behave subjectively. For example, a developer building a Saturation tool needs to define how colors get desaturated. We have an intuitive understanding of how it works, but at the end of the day, what does a desaturated yellow look like, and how does it look different from a desaturated red?
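
As an illustration of that point (my own example, not Darkroom’s actual formula), here are two equally defensible ways a developer could define desaturation, and they treat a pure yellow very differently.

```python
def desaturate_average(r, g, b, amount):
    """Blend each channel toward the plain channel average."""
    gray = (r + g + b) / 3
    return tuple(c + (gray - c) * amount for c in (r, g, b))

def desaturate_luma(r, g, b, amount):
    """Blend each channel toward a perceptual (Rec. 709) luminance instead."""
    gray = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(c + (gray - c) * amount for c in (r, g, b))

# Fully desaturating pure yellow gives two noticeably different grays:
print(desaturate_average(1.0, 1.0, 0.0, 1.0))  # about (0.67, 0.67, 0.67)
print(desaturate_luma(1.0, 1.0, 0.0, 1.0))     # about (0.93, 0.93, 0.93)
```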

That’s where the conversation about a filter’s character comes back. We want to match the filter by feel, not by technicality.

I upped the luminance of the green and yellow, et voilà!

It’s still not exact, to be clear, but it doesn’t have to be, nor should it be. This is a base, a foundation, on which you build, iterate, and make your own. You have learned how to fish; now go catch a big one and feed your whole photography family!


Conclusion

I’ve uploaded the filter created in this tutorial to our server so you can download it and play with it. I named the filter “Charlie” after the Twitter user who turned me onto this rant in the first place. If you have the Darkroom app installed on your phone, tap the link below and it’ll open the app and install the filter.

To install the Charlie filter, paste this in MobileSafari on a phone with Darkroom installed:

darkroom:///install_filter?id=547

p.s. One last tip when doing something like this: I suggest creating filters every time you export a photo with changes from Darkroom, and matching the file name to the name of the filter. I number my attempts like commits in a git code repository. It took 9 attempts to match this filter. Enjoy the process!

Source: Recreating VSCO Filters in Darkroom – Dispatches From Bergen – Medium

The world is in greater peril from those who tolerate or encourage evil than from those who actually commit it.

Albert Einstein

You May Want to Marry My Husband – The New York Times

I have been trying to write this for a while, but the morphine and lack of juicy cheeseburgers (what has it been now, five weeks without real food?) have drained my energy and interfered with whatever prose prowess remains. Additionally, the intermittent micronaps that keep whisking me away midsentence are clearly not propelling my work forward as quickly as I would like. But they are, admittedly, a bit of trippy fun.

Still, I have to stick with it, because I’m facing a deadline, in this case, a pressing one. I need to say this (and say it right) while I have a) your attention, and b) a pulse.

I have been married to the most extraordinary man for 26 years. I was planning on at least another 26 together.

Want to hear a sick joke? A husband and wife walk into the emergency room in the late evening on Sept. 5, 2015. A few hours and tests later, the doctor clarifies that the unusual pain the wife is feeling on her right side isn’t the no-biggie appendicitis they suspected but rather ovarian cancer.

As the couple head home in the early morning of Sept. 6, somehow through the foggy shock of it all, they make the connection that today, the day they learned what had been festering, is also the day they would have officially kicked off their empty-nestering. The youngest of their three children had just left for college.

So many plans instantly went poof.

No trip with my husband and parents to South Africa. No reason, now, to apply for the Harvard Loeb Fellowship. No dream tour of Asia with my mother. No writers’ residencies at those wonderful schools in India, Vancouver, Jakarta.

No wonder the words cancer and cancel look so similar.

This is when we entered what I came to think of as Plan “Be,” existing only in the present. As for the future, allow me to introduce you to the gentleman of this article, Jason Brian Rosenthal.

He is an easy man to fall in love with. I did it in one day.

Let me explain: My father’s best friend since summer camp, “Uncle” John, had known Jason and me separately our whole lives, but Jason and I had never met. I went to college out east and took my first job in California. When I moved back home to Chicago, John — who thought Jason and I were perfect for each other — set us up on a blind date.

It was 1989. We were only 24. I had precisely zero expectations about this going anywhere. But when he knocked on the door of my little frame house, I thought, “Uh-oh, there is something highly likable about this person.”

By the end of dinner, I knew I wanted to marry him.

Jason? He knew a year later.

I have never been on Tinder, Bumble or eHarmony, but I’m going to create a general profile for Jason right here, based on my experience of coexisting in the same house with him for, like, 9,490 days.

First, the basics: He is 5-foot-10, 160 pounds, with salt-and-pepper hair and hazel eyes.

The following list of attributes is in no particular order because everything feels important to me in some way.

He is a sharp dresser. Our young adult sons, Justin and Miles, often borrow his clothes. Those who know him — or just happen to glance down at the gap between his dress slacks and dress shoes — know that he has a flair for fabulous socks. He is fit and enjoys keeping in shape.

If our home could speak, it would add that Jason is uncannily handy. On the subject of food — man, can he cook. After a long day, there is no sweeter joy than seeing him walk in the door, plop a grocery bag down on the counter, and woo me with olives and some yummy cheese he has procured before he gets to work on the evening’s meal.

Jason loves listening to live music; it’s our favorite thing to do together. I should also add that our 19-year-old daughter, Paris, would rather go to a concert with him than anyone else.

When I was working on my first memoir, I kept circling sections my editor wanted me to expand upon. She would say, “I’d like to see more of this character.”

Of course, I would agree — he was indeed a captivating character. But it was funny because she could have just said: “Jason. Let’s add more about Jason.”

He is an absolutely wonderful father. Ask anyone. See that guy on the corner? Go ahead and ask him; he’ll tell you. Jason is compassionate — and he can flip a pancake.

Jason paints. I love his artwork. I would call him an artist except for the law degree that keeps him at his downtown office most days from 9 to 5. Or at least it did before I got sick.

If you’re looking for a dreamy, let’s-go-for-it travel companion, Jason is your man. He also has an affinity for tiny things: taster spoons, little jars, a mini-sculpture of a couple sitting on a bench, which he presented to me as a reminder of how our family began.

Here is the kind of man Jason is: He showed up at our first pregnancy ultrasound with flowers. This is a man who, because he is always up early, surprises me every Sunday morning by making some kind of oddball smiley face out of items near the coffeepot: a spoon, a mug, a banana.

This is a man who emerges from the minimart or gas station and says, “Give me your palm.” And, voilà, a colorful gumball appears. (He knows I love all the flavors but white.)

My guess is you know enough about him now. So let’s swipe right.

Wait. Did I mention that he is incredibly handsome? I’m going to miss looking at that face of his.

If he sounds like a prince and our relationship seems like a fairy tale, it’s not too far off, except for all of the regular stuff that comes from two and a half decades of playing house together. And the part about me getting cancer. Blech.

In my most recent memoir (written entirely before my diagnosis), I invited readers to send in suggestions for matching tattoos, the idea being that author and reader would be bonded by ink.

I was totally serious about this and encouraged submitters to be serious as well. Hundreds poured in. A few weeks after publication in August, I heard from a 62-year-old librarian in Milwaukee named Paulette.

She suggested the word “more.” This was based on an essay in the book where I mention that “more” was my first spoken word (true). And now it may very well be my last (time shall tell).

In September, Paulette drove down to meet me at a Chicago tattoo parlor. She got hers (her very first) on her left wrist. I got mine on the underside of my left forearm, in my daughter’s handwriting. This was my second tattoo; the first is a small, lowercase “j” that has been on my ankle for 25 years. You can probably guess what it stands for. Jason has one too, but with more letters: “AKR.”

I want more time with Jason. I want more time with my children. I want more time sipping martinis at the Green Mill Jazz Club on Thursday nights. But that is not going to happen. I probably have only a few days left being a person on this planet. So why am I doing this?

I am wrapping this up on Valentine’s Day, and the most genuine, non-vase-oriented gift I can hope for is that the right person reads this, finds Jason, and another love story begins.

I’ll leave this intentional empty space below as a way of giving you two the fresh start you deserve.

With all my love, Amy

Flatland II: A New Series of Dramatically Skewed Photographic Landscapes by Aydin Büyüktas | Colossal

Turkish digital artist and photographer Aydin Büyüktas continues his dizzying landscape series Flatland with this new collection of collages shot in various locations around the world. Each image requires around 18-20 aerial drone shots which are then stitched together digitally to form sweeping landscapes that curl upward without a visible horizon. As we’ve noted before, Büyüktas found inspiration in a century-old satirical novel titled Flatland about a two-dimensional world inhabited by geometric figures. You can see more from the series on his Facebook page.

Source: Flatland II: A New Series of Dramatically Skewed Photographic Landscapes by Aydin Büyüktas | Colossal