Show HN: Spent 450hrs to bring my CV down to 1 page (ML, AI) https://ift.tt/40JOYs2

10 second version: Get a 10,000-foot view of job descriptions - https://ift.tt/5LIdytN

2 minute version: Hi HN, long time lurker, big fan, and first time poster, inspired by how this community elaborates on ideas and new products.

I was recently given feedback that my CV was too long at 2 pages, and I was at a loss as to how to trim it without a high-level view of the requirements for the kinds of jobs I'm interested in. So I built https://ift.tt/5LIdytN to help me study relevant job requirements, categorized by seniority, salary and keywords. I then used my site to update my own CV!

Overall the whole process was far simpler than I thought it'd be, and the work broke down as below:

0. study the corpus (70hrs)
1. gather job descriptions (requests - 50hrs)
2. apply NLP to this text (nltk - 120hrs)
3. train a custom spaCy model to decide whether a given sentence constitutes a requirement within the body of the text (40hrs - includes supervision)
4. return the results in a harmonized format (pandas - 50hrs)
5. present them through a website (flask/postgres/heroku/bootstrap - 120hrs)

Have a look and let me know what you think.

HN Special: I don't want to hoard this data and let it sit on some database. If it inspires you, send across queries (SQL or otherwise) you would like to run against this database. I would love to add them to a future version of BDDB. You can assume the table below for your mock queries:

+ requirements
+ location
+ seniority
+ title
+ date
+ salary
+ keywords

Relay email account for queries: 8fi1pj5fb_at_mozmail_dot_com

Upwards and onwards!

https://ift.tt/5LIdytN

January 15, 2024 at 11:33PM
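The requirement-extraction steps above (split a job description into sentences, then decide which ones are requirements) could be sketched roughly as below. This is a minimal stand-in, not the author's code: the post says nltk did the NLP and a supervised spaCy model did the classification, so the regex sentence splitter and the cue-phrase heuristic here are my own simplifying assumptions to keep the sketch runnable without model files or downloads.

```python
import re

# Assumed cue phrases standing in for the trained spaCy classifier.
REQUIREMENT_CUES = ("experience with", "proficiency in", "must have",
                    "familiarity with", "years of", "degree in")

def split_sentences(text: str) -> list[str]:
    # Naive splitter: break on ., ! or ? followed by whitespace
    # (the post used nltk for this step).
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def is_requirement(sentence: str) -> bool:
    # Heuristic stand-in for the supervised spaCy model.
    lowered = sentence.lower()
    return any(cue in lowered for cue in REQUIREMENT_CUES)

def extract_requirements(job_description: str) -> list[str]:
    return [s for s in split_sentences(job_description) if is_requirement(s)]

description = ("We build ML tooling for recruiters. "
               "Must have 3+ years of Python. "
               "Experience with spaCy is a plus. "
               "We offer a hybrid work policy.")
print(extract_requirements(description))
# → ['Must have 3+ years of Python.', 'Experience with spaCy is a plus.']
```

In the real pipeline the cue-phrase check would be replaced by the trained model's prediction, which is where the 40 hours of supervision went.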

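For anyone drafting a mock query to send in, here is one example of the kind the post invites, run against the assumed schema. The column names come from the post; the table name `jobs`, the SQLite backend, and the sample rows are my own assumptions for illustration (the real backend is Postgres).

```python
import sqlite3

# In-memory database with the columns listed in the post.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE jobs (
    requirements TEXT, location TEXT, seniority TEXT,
    title TEXT, date TEXT, salary INTEGER, keywords TEXT)""")

# Made-up rows purely to make the query runnable.
rows = [
    ("3+ years Python; SQL", "Berlin", "mid", "ML Engineer",
     "2024-01-10", 70000, "python,sql"),
    ("PhD or equivalent", "London", "senior", "Research Scientist",
     "2024-01-12", 95000, "ml,research"),
    ("pandas; dashboards", "Berlin", "junior", "Data Analyst",
     "2024-01-14", 48000, "pandas,bi"),
]
conn.executemany("INSERT INTO jobs VALUES (?, ?, ?, ?, ?, ?, ?)", rows)

# Example query: average advertised salary and posting count per seniority.
query = """SELECT seniority, AVG(salary) AS avg_salary, COUNT(*) AS n
           FROM jobs GROUP BY seniority ORDER BY avg_salary DESC"""
for seniority, avg_salary, n in conn.execute(query):
    print(seniority, avg_salary, n)
```

Other natural queries against the same schema: most frequent keywords per seniority level, or salary by location filtered on a keyword.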