Introduction

Diving into the 'Ask HN: Who is hiring?' thread became a recurring ritual, albeit a time-consuming one. Each month, scrutinizing the trove of job offers posted there proved an arduous task, and the inconsistent formatting across posts only compounded the challenge. Filtering for a specific role, such as a 'Ruby Senior Developer' working 'full remote' in the 'CEST time zone', yielded only a handful of matches among the myriad entries.

Time proved to be my adversary as well. By the time I started scrutinizing the posts, new listings had already appeared, creating an ever-growing gap in my job search. Amid this deluge, genuinely promising opportunities kept surfacing, and I found myself yearning for a more efficient way to sift through, extract, organize, and precisely filter these job offers.

The Idea

Facing the daunting task of parsing these endless job offerings, one question loomed large: who or what could rescue me from this repetitive drudgery? Enter AI. The buzz surrounding Large Language Model (LLM) capabilities, particularly the surge in interest following ChatGPT's release at the end of 2022, had become increasingly hard to ignore.

It dawned on me that this technology could be the perfect ally in this quest: it could not only help peruse the job offers but also handpick the ones that matched my criteria. Using an LLM seemed a natural fit for extracting the sought-after job details and formatting them in a standardized way.

Development Process

The seed of the idea was sown almost immediately after the release of ChatGPT, but turning the concept into a functional reality demanded numerous iterations. I began with a clear vision of the desired end product: structured output of the data. Yet how to present that data in a user-friendly way, with the same intuitive interaction ChatGPT offers, remained unclear.

The rapid evolution of the AI development landscape over the past year was striking. From the text-davinci-003 completion API to the GPT-3.5 chat API, and from custom chains of thought to OpenAI's standardized function calling, the pace of change was relentless.

After multiple iterations, I settled on a simplified approach: GPT-3.5 Turbo for its cost-efficiency, combined with OpenAI function calling, which moved much of the extraction logic out of custom code and into the prompt.
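To illustrate, here is a minimal sketch (not the project's actual code) of what such an extraction step can look like with the OpenAI Python client; the function name and schema fields below are my own illustrative choices.

```python
import json
from openai import OpenAI  # official openai package, v1.x assumed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical schema for the fields to pull out of each job post.
extract_job_fn = {
    "name": "extract_job_details",
    "description": "Extract structured fields from a 'Who is hiring?' job post.",
    "parameters": {
        "type": "object",
        "properties": {
            "company": {"type": "string"},
            "role": {"type": "string"},
            "location": {"type": "string"},
            "remote": {"type": "boolean"},
            "compensation": {"type": "string"},
        },
        "required": ["company", "role"],
    },
}

def extract_job(post_text: str) -> dict:
    """Ask GPT-3.5 Turbo to return one post's details as structured JSON."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": post_text}],
        functions=[extract_job_fn],
        function_call={"name": "extract_job_details"},  # force the function call
    )
    arguments = response.choices[0].message.function_call.arguments
    return json.loads(arguments)
```

Because the model is forced to answer through the function's JSON schema, every post comes back in the same shape regardless of how inconsistently it was formatted in the thread.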

The user interface design went through its own evolutionary journey. Initially, I experimented with a single text input, expecting users to provide all the instructions at once. This meant a convoluted request such as: 'Read job posts from this URL, then extract compensation, location, and put this information into Google Sheets.'

However, this approach proved cumbersome and lacked the intuitive ease of ChatGPT's user experience. I restructured the user journey into two distinct steps, each with its own UI: in the first, the user chats with an assistant to decide how to gather the data; in the second, the user discusses the specifics of the first extracted item with an AI assistant.

The conclusions of that conversation could then be applied across all other records, streamlining the process. The UI design drew inspiration from modern webmail interfaces such as Gmail or Outlook: a mail account maps to a project, emails to records, and mail threads to record chats.
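To make the analogy concrete, a data model along these lines would fit; the class and field names below are hypothetical, not taken from the project.

```python
from dataclasses import dataclass, field

@dataclass
class RecordChat:          # a "mail thread": the conversation about one record
    messages: list[str] = field(default_factory=list)

@dataclass
class Record:              # an "email": one scraped job post plus its chat
    raw_text: str
    extracted: dict | None = None
    chat: RecordChat = field(default_factory=RecordChat)

@dataclass
class Project:             # a "mail account": one scraping/extraction project
    name: str
    source_url: str
    records: list[Record] = field(default_factory=list)
```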


Results and Impact