<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Career on Corey Daley</title><link>https://coreydaley.dev/categories/career/</link><description>Recent content in Career on Corey Daley</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Thu, 16 Apr 2026 18:35:00 -0400</lastBuildDate><atom:link href="https://coreydaley.dev/categories/career/rss.xml" rel="self" type="application/rss+xml"/><item><title>A 12-Month AI/ML Roadmap for Engineers Who Feel Behind</title><link>https://coreydaley.dev/posts/2026/04/12-month-ai-ml-learning-roadmap/</link><pubDate>Thu, 16 Apr 2026 18:35:00 -0400</pubDate><guid>https://coreydaley.dev/posts/2026/04/12-month-ai-ml-learning-roadmap/</guid><description>&lt;p&gt;Every senior engineer I know has a version of the same conversation with themselves: &amp;ldquo;I should really learn more about ML.&amp;rdquo; It comes up during a planning meeting when someone mentions embeddings. It comes up when a job description at an interesting company lists MLOps as a requirement. Then the sprint board pulls your attention back, and nothing changes.&lt;/p&gt;
&lt;p&gt;I&amp;rsquo;ve published a 12-month AI/ML learning roadmap designed specifically for experienced software engineers — not a beginner tutorial, but a structured path from ML foundations through LLMs and generative AI, ML engineering at scale, and a capstone that turns a year of steady work into visible career leverage. The core idea: AI/ML becomes career-changing when it compounds through one sustained body of work, not when it&amp;rsquo;s consumed as scattered content.&lt;/p&gt;
&lt;p&gt;If you&amp;rsquo;ve been sitting on the feeling that you should be doing something about this, the plan is already written: most of the resources are free, and the rest are investments worth making.&lt;/p&gt;
&lt;p&gt;Read more at &lt;a
 href="https://coreydaley.dev/posts/2026/04/12-month-ai-ml-learning-roadmap/" target="_blank" rel="noopener noreferrer"&gt;https://coreydaley.dev/posts/2026/04/12-month-ai-ml-learning-roadmap/&lt;/a&gt;
&lt;/p&gt;</description></item><item><title>The Rise of the Agent Wrangler</title><link>https://coreydaley.dev/posts/2026/03/the-rise-of-the-agent-wrangler/</link><pubDate>Tue, 10 Mar 2026 11:55:00 -0400</pubDate><guid>https://coreydaley.dev/posts/2026/03/the-rise-of-the-agent-wrangler/</guid><description>&lt;p&gt;People keep asking if AI is going to replace software engineers. Better question: who can still be trusted to ship production software when most implementation is delegated to agents? That role is the Agent Wrangler — and it isn&amp;rsquo;t a step down from engineering, it&amp;rsquo;s a different kind of engineering.&lt;/p&gt;
&lt;p&gt;You spend your day directing Claude Code, Codex, and similar tools through feature work, bug hunts, security audits, and codebase exploration. The job sounds easier than traditional engineering. It isn&amp;rsquo;t — at least not for the people who do it well. Because when you&amp;rsquo;re orchestrating agents, your technical depth is the control surface. CS fundamentals don&amp;rsquo;t disappear; they become the language you use to catch when an agent is wrong.&lt;/p&gt;
&lt;p&gt;Software engineers aren&amp;rsquo;t going away. They need to adapt — like they always have. Maybe the real new title is &amp;lsquo;Adaptability Engineer.&amp;rsquo; Are you ready to stop coding and start wrangling?&lt;/p&gt;
&lt;p&gt;Read more at &lt;a
 href="https://coreydaley.dev/posts/2026/03/the-rise-of-the-agent-wrangler/" target="_blank" rel="noopener noreferrer"&gt;https://coreydaley.dev/posts/2026/03/the-rise-of-the-agent-wrangler/&lt;/a&gt;
&lt;/p&gt;</description></item><item><title>Making Your AI Subscriptions Pay for Themselves</title><link>https://coreydaley.dev/posts/2026/03/making-ai-subscriptions-pay-for-themselves/</link><pubDate>Sat, 07 Mar 2026 14:00:00 -0500</pubDate><guid>https://coreydaley.dev/posts/2026/03/making-ai-subscriptions-pay-for-themselves/</guid><description>&lt;p&gt;A coworker and I were debriefing after an AI Bootcamp when I said the quiet part out loud: &amp;ldquo;I need my AI subscriptions to pay for themselves.&amp;rdquo; Add up Claude Pro, ChatGPT Plus, GitHub Copilot, Cursor, and a research tool, and you&amp;rsquo;re looking at $100+ a month just to stay current.&lt;/p&gt;
&lt;p&gt;The mental shift that changes everything: stop running your AI stack like subscriptions and start running it like equipment. Every tool needs a job. Assign each one to a revenue output, pick one small experiment, and ship something real in two weeks.&lt;/p&gt;
&lt;p&gt;You don&amp;rsquo;t need a hit app — you need $112/month and a closed loop. Are you running your AI tools in consumer mode or operator mode?&lt;/p&gt;
&lt;p&gt;Read more at &lt;a
 href="https://coreydaley.dev/posts/2026/03/making-ai-subscriptions-pay-for-themselves/" target="_blank" rel="noopener noreferrer"&gt;https://coreydaley.dev/posts/2026/03/making-ai-subscriptions-pay-for-themselves/&lt;/a&gt;
&lt;/p&gt;</description></item><item><title>Beyond LeetCode: How AI is Transforming Technical Interviews</title><link>https://coreydaley.dev/posts/2026/02/beyond-leetcode-how-ai-is-transforming-technical-interviews/</link><pubDate>Sun, 15 Feb 2026 17:46:25 -0500</pubDate><guid>https://coreydaley.dev/posts/2026/02/beyond-leetcode-how-ai-is-transforming-technical-interviews/</guid><description>&lt;p&gt;The coding interview landscape is shifting dramatically as AI tools become standard in software development. Rather than memorizing algorithms, candidates may soon demonstrate their ability to work effectively with AI agents—breaking down projects, creating tasks in Jira, and collaborating to solve real-world problems.&lt;/p&gt;
&lt;p&gt;This shift raises important questions about what skills truly matter and how we evaluate engineering talent in an AI-augmented world.&lt;/p&gt;
&lt;p&gt;What does this mean for the future of technical hiring?&lt;/p&gt;
&lt;p&gt;Read more at &lt;a
 href="https://coreydaley.dev/posts/2026/02/beyond-leetcode-how-ai-is-transforming-technical-interviews/" target="_blank" rel="noopener noreferrer"&gt;https://coreydaley.dev/posts/2026/02/beyond-leetcode-how-ai-is-transforming-technical-interviews/&lt;/a&gt;
&lt;/p&gt;</description></item><item><title>The Ethics of AI-Generated Code in Open Source: A Balanced Perspective</title><link>https://coreydaley.dev/posts/2026/02/ethics-of-ai-generated-code-in-open-source/</link><pubDate>Fri, 13 Feb 2026 11:57:51 -0500</pubDate><guid>https://coreydaley.dev/posts/2026/02/ethics-of-ai-generated-code-in-open-source/</guid><description>&lt;p&gt;Here&amp;rsquo;s a question that&amp;rsquo;s been keeping me up at night: When does using AI coding assistants cross the line from productivity tool to ethical problem? I&amp;rsquo;ve been using tools like GitHub Copilot and Claude Code extensively, and I started wondering—if someone submits AI-generated code to open source projects and builds their reputation on it, is that fundamentally different from using Stack Overflow or IDE autocomplete?&lt;/p&gt;
&lt;p&gt;In my latest blog post, I explore both sides of this debate. On one hand, AI democratizes contributions and amplifies what we can accomplish. On the other, it raises serious questions about authenticity, trust, and what it means to truly &amp;lsquo;know&amp;rsquo; the code you&amp;rsquo;re responsible for. The middle ground is messy and context-dependent.&lt;/p&gt;
&lt;p&gt;Where do you draw the line? Should contributors be required to disclose AI usage? What do you think?&lt;/p&gt;
&lt;p&gt;Read more at &lt;a
 href="https://coreydaley.dev/posts/2026/02/ethics-of-ai-generated-code-in-open-source/" target="_blank" rel="noopener noreferrer"&gt;https://coreydaley.dev/posts/2026/02/ethics-of-ai-generated-code-in-open-source/&lt;/a&gt;
&lt;/p&gt;</description></item><item><title>Why AI Interviews Can Feel Fairer (And What Humans Still Do Better)</title><link>https://coreydaley.dev/posts/2026/02/ai-interviewer-fairness/</link><pubDate>Sat, 07 Feb 2026 17:31:33 -0500</pubDate><guid>https://coreydaley.dev/posts/2026/02/ai-interviewer-fairness/</guid><description>&lt;p&gt;We&amp;rsquo;ve all been there: two candidates, two different interviewers, completely different experiences. One gets warm small talk and easy questions. Another faces a colder room and tougher grilling. The outcome often depends on who you got and what kind of day they were having. That&amp;rsquo;s where AI interviewers start to feel like a real upgrade.&lt;/p&gt;
&lt;p&gt;AI brings consistency—same questions, same order, every time. It reduces unconscious bias by sticking to job-relevant criteria and avoiding the silent &amp;lsquo;do I like this person&amp;rsquo; filter that favors similar backgrounds. Mood doesn&amp;rsquo;t swing the outcome because AI doesn&amp;rsquo;t have bad days. But humans still excel at reading body language, adapting on the fly, and making nuanced judgment calls.&lt;/p&gt;
&lt;p&gt;The future probably isn&amp;rsquo;t AI replacing humans—it&amp;rsquo;s AI handling structure while humans bring empathy. What&amp;rsquo;s your take on AI interviews? Would you prefer them over traditional ones?&lt;/p&gt;
&lt;p&gt;Read more at &lt;a
 href="https://coreydaley.dev/posts/2026/02/ai-interviewer-fairness/" target="_blank" rel="noopener noreferrer"&gt;https://coreydaley.dev/posts/2026/02/ai-interviewer-fairness/&lt;/a&gt;
&lt;/p&gt;</description></item></channel></rss>