58 comments

  • gignico 9 hours ago ago

    My two cents as a university teacher:

    In my view AI tools are a sort of super-advanced interactive documentation. You can learn factual information (hallucinations excluded) by either asking questions or looking at the generated code and its explanations. But just as documentation alone was not a sufficient learning tool before, AI alone is not one now.

    What AI cannot give you, and what I suggest you learn through other resources:

    - algorithmic proficiency, i.e. how to decompose your problems into smaller parts and compose a solution. You don’t necessarily need a full algorithms course (even though you can find good ones online for free), but familiarising yourself with at least a few classical non-trivial algorithms (e.g. sorting or graph-related ones) is mind-changing. (There is a small sketch of one after this list.)

    - high-level design and architecture, i.e. how to design abstractions and use them to keep a codebase maintainable as it grows. Here the best approach is to read the code of established codebases in your preferred programming language. A good writer is an avid reader. A good programmer reads a lot of other people’s code.

    - how programming languages work, i.e. the different paradigms and ways of thinking about programming. This lets you avoid fixating on a single one and lets you pick the right tool for each task. I suggest learning both strongly typed and dynamic languages, to get a feeling for their pros and cons.

    That’s an incomplete list, off the top of my head.
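
    As a small illustration of the algorithmic point, here is a minimal merge sort sketch (Python is just an assumption for the example; any language works). The interesting part is the decompose-then-compose shape, not the specific code:

        def merge_sort(items):
            # Decompose: split the problem into two smaller ones.
            if len(items) <= 1:
                return items
            mid = len(items) // 2
            left = merge_sort(items[:mid])
            right = merge_sort(items[mid:])

            # Compose: merge the two sorted halves back together.
            merged = []
            i = j = 0
            while i < len(left) and j < len(right):
                if left[i] <= right[j]:
                    merged.append(left[i])
                    i += 1
                else:
                    merged.append(right[j])
                    j += 1
            merged.extend(left[i:])
            merged.extend(right[j:])
            return merged

        print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]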

    You can still use AI as a tool while learning these things, but good old books and online resources (like Coursera) have worked really well for decades and are not obsolete at all.

    And the last thing is the most important: curiosity about how things work and about how to make them better!

  • jmathai 10 hours ago ago

    I learned all of my programming outside of university and textbooks. It’s one way to learn. Not the only way though - and it has its limits - but you can get pretty far.

    But here is my advice. Learning by doing with AI seems akin to learning by copying source code from somewhere else (i.e. view source, Stack Overflow).

    My tips:

    - Understand all of the code in a commit before committing it (per feature/bug).

    - Learn by asking AI for other ways or patterns to accomplish the same thing it suggests.

    - Ask Claude Code to explain the code until you understand it.

    - If code looks complex, ask if it can be simplified. Then ask why the simple solution is better.

    - Tell AI that you’d like to use OOP, functional programming, etc. (a rough sketch of that kind of steering follows this list).
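
    To make that last tip concrete, here is a rough, made-up Python sketch of the same small task written both ways - neither is "the" right answer, the point is that you can ask the AI for either style and compare:

        # Functional style: plain data in, plain data out.
        def apply_discount(prices, rate):
            return [round(p * (1 - rate), 2) for p in prices]

        # OOP style: state and behaviour bundled together.
        class Cart:
            def __init__(self, prices):
                self.prices = list(prices)

            def apply_discount(self, rate):
                self.prices = [round(p * (1 - rate), 2) for p in self.prices]
                return self

        print(apply_discount([10.0, 20.0], 0.1))               # [9.0, 18.0]
        print(Cart([10.0, 20.0]).apply_discount(0.1).prices)   # [9.0, 18.0]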

    One way to measure whether you’re learning is to pay attention to how often you accept AI’s first suggestion versus how often you steer it in a different direction.

    It’s really endless if your mindset is to build AND learn. I don’t think you need to worry about it based on the fact you’re here asking this question.

    • politelemon 10 hours ago ago

      I have found I'm always having to steer it in the right direction. I will think I've given it the right amount of instructions but it tends to do dumb things in ways I haven't anticipated.

    • coffeefirst 8 hours ago ago

      Good stuff, and I’d add one more trick from the old Zed Shaw books: if you want to learn something new, type it out yourself. Can you copy-paste? Can you make the robot do it? Yes, but going through the motions helps embed it in your brain.

      Once it’s deep in your memory, you can take all the shortcuts you want, but now it’s for speed instead of necessity.

    • codetiger 10 hours ago ago

      Came here to type something similar and saw this comment. +1

      Just repeat this until you understand a language's unique ways of implementing things, and why the language made those choices compared to others. I always pick one of these experiments to learn a new language, with or without LLM support: 1. Ray tracing 2. Game Boy emulator 3. Expression evaluation (JSONLogic or regex)

      These are super easy to implement in a few hundred lines of code, but if you want to optimize or perfect the implementation, it takes forever and you need to know the language's nuances to do it well. Focus on performance-tuning these implementations and see how far you can go.
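
      To give a feel for the expression-evaluation option, here is a deliberately tiny JSONLogic-style evaluator in Python. The handful of operators is my own toy assumption - the real spec has many more - but the recursive shape is the interesting part:

          # Toy JSONLogic-style evaluator: {"op": [args...]} or a literal value.
          def evaluate(expr, data):
              if not isinstance(expr, dict):
                  return expr                      # literal: number, string, bool
              (op, args), = expr.items()
              if op == "var":                      # {"var": "age"} looks up a field
                  return data[args]
              args = [evaluate(a, data) for a in args]
              if op == "+":   return sum(args)
              if op == ">":   return args[0] > args[1]
              if op == "and": return all(args)
              raise ValueError(f"unknown op: {op}")

          rule = {"and": [{">": [{"var": "age"}, 18]},
                          {">": [{"+": [{"var": "score"}, 5]}, 50]}]}
          print(evaluate(rule, {"age": 30, "score": 47}))  # True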

  • bcrosby95 9 hours ago ago

    I think it's nearly impossible to "learn" to the same depth when someone else writes the code: it doesn't matter if it's your teacher, friend, coworker, or AI writing the code. There absolutely is a difference between having to toil to come up with an answer, fail, fail some more, work through design flaws, then eventually come up with the right answer. You learn a lot in the process.

    Versus someone or something giving you the correct answer, or even several, and you picking one. You are given what works. But you don't know why, and you don't know what doesn't work.

    Learning from AI coding probably is somewhere between traditional coding and just reading about coding. I'm not sure which one it's closer to though.

    However, it may not be necessary to learn to that depth now that AI coding is here. I don't really know.

  • cladopa 8 hours ago ago

    As someone who learned programming the hard way, I hated it even though I am really good at it. It got in the way of me building the useful products and services that I wanted to create.

    Because of that I learned Lisp so I could do metaprogramming, because doing it manually would take multiple human lifetimes to create what I want before I die, even (or especially) directing a small group of people.

    We use Claude code and personally I love it. It is like the electric sawmill instead of humans cutting manually, sweating and being exhausted after half an hour of work.

    After decades programming I know how to tell the AI what I want, and how to control/log/assert/test/atomize everything so it behaves.

    You can use AI to teach you programming; the problem is that you need to tell it what you want, and if you are not experienced you don't know what that is.

    So do small projects: let the AI do 80% of the work, and spend the remaining 20% finishing by hand. LLMs are usually amazing at approaching a valid solution but really bad at making something perfect. They fix one thing and destroy something else. That last part you do manually.

    But "learning programming" in abstract is like "learning to drill", why do you want to drill? What do you want to drill? Where do you want to drill?

    You need to make it concrete: specific projects, with specific timelines. "I want to play the piano" is abstract. "I want to play this version of the Feather theme in 3 months" is specific.

  • jmward01 10 hours ago ago

    From your description it sounds like you have the most important stuff: variety, a willingness to do things, and a willingness to seek advice. I doubt anyone on HN really knows what it takes to learn to be a coder in the new vibe world. It is really too soon to have seen people 'grow up' in it and the paths that lead to success or not. In general, though, if you want to learn something you need to do stuff related to what you want to learn, you need to do stuff in many different ways, and you need to ask others what they are doing and have done to see if their paths can help you. Keep doing those things and you will likely be fine. The only other advice I can give you, you probably already know: find a passion project. For me it was (initially) fractals. Then it was a thousand other things. One passion project will get you through a lot of learning.

  • yzjumper 10 hours ago ago

    Just don’t use an LLM for learning or for your first projects. I only use it for things I already know how to do, or for research. I treat it like a teenage intern.

    • bionade24 9 hours ago ago

      > I only use it [..] for research.

      And it can be pretty great for that. But I'm not sure this works well for people who don't have experience reading API documentation or coding support sites like Stack Overflow. Beginners with a problem most likely don't know any abstract term for what they want to solve, so they'll feed their scenario meticulously to the LLM, causing it to output an already tailored solution which obscures the core logic of the solution.

      It's like starting to learn how to code by reading advanced code of a sophisticated project.

  • fpauser 8 hours ago ago

    1. https://exercism.org/

    2. disable Copilot

    3. only "talk" about concepts and patterns with AI

    • ipnon 8 hours ago ago

      I agree, use the AI to replace all of your typing, but none of your thinking.

  • mattikl 9 hours ago ago

    Don't be afraid to go deep into simple-sounding topics. The modern world is so full of learning material that there's a temptation to ingest as much of it as possible, but true learning happens when you give yourself time with one topic at a time. And I'd say this is more important than ever, because generative AI is becoming great precisely at generating things, not so great at understanding complex topics.

    That said, learning the fundamental topics can limit your thinking at first if they feel difficult, so it's an interesting question how to keep the naïve creativity of the beginner. That creativity can really help when building with AI, because your thinking is less constrained by how things used to be done.

  • noam_k 7 hours ago ago

    I'd like to draw a parallel to carpentry:

    A carpenter uses tools to shape wood into furniture. Each tool in the toolbox has different uses, but some are more efficient than others. For example, a table saw lets the carpenter cut more quickly and accurately than a hand saw. Nobody would say "that's not a real carpenter, he cheats by using a table saw".

    A carpenter can also have an assistant (and I'm specifically not talking about an apprentice) who can help with certain tasks. The assistant might have been trained by someone else and know how to perform complex tasks. When the carpenter builds something with the assistant's help, is that considered a team effort? Does the carpenter need to take responsibility for the assistant's mistakes, or does the trainer? Who gets credit for the work?

    I don't have answers for these questions, but I think the parallel to software is straightforward: we have a new tool (assistant) that's available, and we're trying to use it effectively. Perhaps it's going to replace some of our older tools, and that's a good thing! Some of us will be lazy and offload everything to it, and that's bad.

    I do think that learning the fundamentals is as necessary as ever, and AI is a great tool for that as well.

    (Disclaimer: I've been programming for about 15 years, and haven't integrated AI into my workflow yet.)

  • appsoftware 8 hours ago ago

    We're all ultimately just learning what we need to in order to get the job done. After 20 years of programming, it is very clear that nobody knows everything. Everyone just knows their own little slice of the software world, and even then you have to 'use it or lose it'. If you're feeling imposter syndrome, keep a study side project going where you don't use any AI - something like NAND to Tetris that forces you to learn low-level concepts - and then just stay productive using AI for the rest of your work.

  • zeroonetwothree 10 hours ago ago

    If you can’t code without an AI then you don’t really know how to code. It’s important to learn skills manually before automating them.

    • journal 5 hours ago ago

      If you can't do math without a calculator then you really don't know math, but when does that matter?

      • 1718627440 an hour ago ago

        When you want to do function analysis or learn calculus.

  • gorbachev 2 hours ago ago

    You're only one year and change into it. It takes longer than that for most people to become anywhere near proficient at it.

  • 1jreuben1 6 hours ago ago

    You never stop learning to code. Experimenting with LeetCode and Claude Code - revisiting the landscape of DS&A (again), enough to steer the which, when and what, and validate the how. E.g. I know what a skip list does, when to use one, and fuzzily how it works. The hardest things about LeetCode: 1) knowing you will forget it, 2) knowing you will never use it much, 3) knowing you are forgoing time to learn other stuff, 4) knowing that AI makes knowing it redundant.

  • delis-thumbs-7e 9 hours ago ago

    I don’t try to ship quickly. I started learning programming in 2024; I’d say I’m pretty good with Python, proficient in vanilla web tech, OK with C, and I know the basics of React/full-stack. Starting from nothing, I’d say I have progressed very fast; I follow a university CS course. LLMs have certainly helped in explaining concepts and in learning, but I barely use them to code at all.

    I recognised that my weaknesses are more in understanding the mathematical fundamentals of computation, so now I’m mostly studying maths rather than coding, currently linear algebra and probability theory. Coding is the easy part, I’d say. Hopefully I get to concentrate on the study of my sworn enemy, algorithms, at some point.

    I’d like to be able to do low-code and graphics/sound programming some day. Or maybe use that knowledge for some other cool stuff, if we are all replaced by robots anyway.

  • ILoveHorses 9 hours ago ago

    You can use AI to lead you to better sources. The issue I face is that whenever I search for something I want to understand in a search engine, the first 10 links are always low-quality SEO links or surface-level AI-generated tutorials. There is a treasure trove of high-quality blogs, books, and interactive tutorials out there that don't show up when you search. For example, if you wanted to learn socket programming, you'd be better off following Beej's guide to socket programming instead of 100 g4g pages. Similarly, for Bash, you'd actually understand how every word you write works, instead of just memorizing 20 commands, if you followed TLDP's book or lhunath's guide. How do you find these resources? Use Perplexity or Reddit's AI to search for high-quality resources.

  • micaeked 10 hours ago ago

    I recommend Zachtronics games. I wouldn't go as far as to claim direct knowledge or skill transfer to "real" programming, but it sure feels like they exercise the metaphorical muscles in a very different way.

    Side note, I'm assuming you find joy in programming. If you don't, there's better ways to spend your time.

  • hknlof1 8 hours ago ago

    I truly believe that, even before vibe coding, the number of abstractions one develops against has been getting in the way of learning programming and feeling good about it.

    You do React + Redux or any other framework and feel like a lot of decisions have been made for you, without grasping the actual reasoning behind those decisions.

    The best learners I have encountered do the following, and for about a year I have been trying to implement it myself:

    Learn the platform you develop for on a side project. If you develop for the web and lean more toward the programming side: learn JS and HTML for the web. You will encounter state management, animations, events, DOM manipulation, etc. Solve them without libraries first.

  • jgammell 4 hours ago ago

    My university made us learn to code 'close to the metal' and IMO this is a great way to gain an understanding of what is actually going on. Program in C, no IDE, no AI tools.

    The AI tools are incredibly helpful (and people who say otherwise are disingenuous), but if you don't already roughly know how you want to implement something and you let AI take the wheel, you aren't going to learn anything. From a learning standpoint, I feel like the best approach is to plan and write your code without using AI at all, then maybe use it as a critic to give feedback on what you've done.

  • shinycode 9 hours ago ago

    Drop AI, open a basic editor, and write everything by hand without asking AI anything. Do searches by yourself. That’s how the world worked for decades, pre-2022. Debug on your own, without asking AI anything either.

    • mrweasel 8 hours ago ago

      AI has changed nothing in terms of learning to program; it's every bit as complicated as it ever was (well, languages are better now compared to the 1960s, but it's still hard).

      Becoming an expert takes years, if not decades. If someone only started programming in 2025, they still have a long way to go. I get that seeing others move fast with AI can be discouraging, and the only advice I can give is "ignore them". In fact, ignore everyone attempting to push LLMs upon you. If you're learning to program, you're not really ready for AI-assisted coding; wait ten years.

      There's no really satisfying answer other than: Keep at it, you're probably doing better than you think, but it will take years.

      • jraph 7 hours ago ago

        > AI has changed nothing in terms of learning to program

        In terms of what you should be doing when you learn to program, I fully agree.

        In terms of the effects AI has on the activity of learning to program, I think it has: it has made it very tempting (and affordable - so far) to just have the AI build and even adapt the simple stuff for you that you'd otherwise be building and adapting yourself. I suppose it can even give you the false feeling that you understand the stuff it has built for you by reading the generated code. But this means you never go through the critical learning steps (trial and error, hard thinking about the thing, noticing what you are missing, etc.).

        We already had the possibility of running web searches and copy-pasting publicly available stuff, but that came with more friction, and the automated adaptation aspect was not there; you had to do it yourself. I think Gen AI has made it way easier to be lazy in the learning, and it's a trap.

        But from the rest of your comment it seems we mostly agree.

    • jraph 9 hours ago ago

      +1

      If you really can't drop the AI, ask it stuff when you are really blocked, but ask it not to provide code (you need to write it yourself to understand and learn). Even then, I suspect you'd be better served by a regular web search and by reading tutorials written by human beings who crafted and optimized the writing for pedagogy.

      It will probably feel slow and tedious, but that's actually a good, more efficient use of your time.

      At this point of your journey, where your goal is above all to learn, I doubt the AI works in your interest. It's already unclear whether it provides a long-term productivity boost to people who are a bit more experienced but still need to improve their craft.

      You don't need to optimize the time it takes to build something. You are the one to "optimize".

  • foota 8 hours ago ago

    Imho (in my harsh opinion, in this case), unless you're a prodigy, you're probably not a very good programmer at this point. Now, granted, people learn more quickly outside of university, I think, because it tends to be more focused - but I don't think I know anyone I would have called a good programmer a year in.

    But feel free to call yourself a programmer, I'm not going to gatekeep it :)

  • keepamovin 10 hours ago ago

    Is this like learning calligraphy in the typesetting era?

    Before the AI era, I didn’t know much bash, but I was a reasonably OK programmer besides that, I think. I found that by getting AI to write me a lot of bash scripts, following along, and then making edits myself when I needed small things changed, I ended up with the ability to write bash now, and I actually kind of appreciate it as a language, whereas before I thought it was confusing. YMMV

    Like anything with enough dedication you can achieve what you want.

    • Antibabelic 9 hours ago ago

      This is a strange analogy, because learning calligraphy is essential for any type designer worth their salt. Read The Stroke by Gerrit Noordzij.

      • keepamovin 9 hours ago ago

        I don’t mean type designers, I mean the Gutenberg press. Before mechanical printing, books were copied by monks using calligraphy, weren’t they?

        • jraph 8 hours ago ago

          It's not exactly the same thing.

          Once the Gutenberg press exists, knowing how to copy whole books by hand is 0% useful anymore, including for producing copies with a press. There's also virtually no advantage to hand-copying a book when you have a press.

          You still need to know how to program to build something and maintain it in the long run. You need to be able to understand the Gen AI's output or you are in for some trouble, and to deeply understand the Gen AI's output you need to have practiced programming. What's more, you need to have practiced not only (generic) programming, but the specific stuff you are working on (the domain, the specific technologies, the specific codebase).

          • keepamovin 8 hours ago ago

            It was a little bit of a humorous tease however, I think there’s a side to it you’re missing as valid is what you say right now is.

            • jraph 7 hours ago ago

              > It was a little bit of a humorous tease

              Whoops, missed that, sorry for this!

              Not sure I understand the rest of your sentence; I take it you are saying that what I'm saying is only valid right now but could change as Gen AI keeps improving.

              I personally think that significantly improving this stuff will require a breakthrough / paradigm shift, and that the "sophisticated stochastic parrot" model, even with "reasoning" stuff "patched" on top, might only go so far and might quickly plateau (this is not science, only mostly uninformed opinion though).

              • keepamovin 4 hours ago ago

                Hey bud I'm with you there on the next gen breakthroughs requiring more than the current models + reasoning, tho they do take it pretty far. Re the sentence: s/as valid is/as valid as/, but yeah you got me even with the error!

                I think truly next gen requires embodiment so systems can learn emotions and consequences, plus reason from their own perspective. I also think the NLP processing can be radically simplified to make training/inference way lower cost. There's also probably another layer we haven't grokked yet, maybe something like NLP/transformers on abstract non linguistic symbolic reasoning, that emerges from linguistics and world models/embodied experience, to truly refine this to the ideal of intellect we are seeking. That should open the gate to AGI, tho there's probably some other magic x-factor step to take a perfectly intelligent individuated synthetic consciousness (in a robot body) to whatever we want from AGI tho. Idk, what do you think? :)

  • everfrustrated 9 hours ago ago

    So long as you can prompt your AI to successfully debug your way out of problems - you don't need to understand code.

    I appreciate this will be a deeply controversial statement here. As someone who's been coding for 25+ years and has part of my identity tied to my ability to code, this hurts and wounds me, but it is sadly true. The skills I've built and honed have little value in this new market. This must be how musicians felt when radio, records, etc. came about. My craft has been commoditized, and it turns out nobody cared about the craft. They are happy listening to canned music in restaurants. Musicians are now like zoo animals where people pay an entry fee to see them for the novelty value. I exaggerate to illustrate the shift, but part of me fears this might be more analogous than I dare to understand.

    Code is about providing value to a business not in the lines of code themselves. Code is a means to an end.

    If you want to understand coding for your own intellectual and hobbyist pursuit then please do. Generations of autistic-leaning people have found satisfaction doing so - but don't do it thinking it will remain a rewarding career.

    • input_sh 9 hours ago ago

      So long as you can navigate someone from the passenger seat, you don't need to know how to drive a car. I've been an experienced driver myself, but in the age of self-driving cars it's just not a useful skillset to have. A car is just a means to an end, why learn how to drive it when you can simply hop into a taxi?

      The answer: because people find joy in doing it themselves.

      • jraph 7 hours ago ago

        > why learn how to drive it when you can simply hop into a taxi?

        Because hopping into a taxi is kinda expensive, most can't do that daily.

        > in the age of self-driving cars it's just not a useful skillset to have

        Self-driving cars are not there yet, especially as there are somewhat unpredictable human beings still driving around and imperfect infra. Laws are also not really there yet around the world too.

        Self-driving is also kind of a black box that you don't really have control over, especially as long as these cars are connected to the mother company.

        In a way, most of that is mostly true for programming and Gen AI as well (and Gen AI might become expensive as well), so your analogy might be quite apt in the end xD

        Otherwise,

        > because people find joy in doing it themselves

        Many people seem to enjoy it indeed. I'd be perfectly happy delegating driving. I can like driving, but I don't enjoy the responsibility and the risk that I mess something up.

        I do enjoy programming myself though :-)

    • loveparade 9 hours ago ago

      Okay, but most of the time you can't prompt your AI to successfully debug you out of problems if you don't understand code. Or when you do the AI will solve the problem in a way that creates a dozen more cascading problems an hour later. I've also been coding for 20 years now and I feel like my coding skills are just as important now as they were 10 years ago. Without them I'd never be able to use AI effectively.

      The only exception really are greenfield apps like "create a toy todo app demo" or "scaffold this react project" but that's like 0.001% of real world engineering work.

      • everfrustrated 9 hours ago ago

        True, but it very much depends on the domain and the complexity of the stack you're working in. For a lot of CRUD-type dev work, the problems are common to many and AI will have no trouble.

  • beej71 9 hours ago ago

    I'm an experienced dev, out of the industry now. I'm trying to level up in Rust, and here's what I do.

    I bust my ass getting software written by hand using The Book and the API reference. Then I paste it into an LLM and ask it to review it. I steal the bits I like. The struggle is where we learn, after all.

    I also bounce ideas off LLMs. I tell it of a few approaches I was considering and ask it to compare and contrast.

    And I ask it to teach me about concepts. I tell it what my conception is, and ask it to help me better understand it. I had a big back and forth about Rust's autoderef this morning. Very informative.

    I very, very rarely ask it to code things outright, preferring to have it send me to the API docs. Then I ask it more questions if I'm confused.

    When learning, I use LLMs a lot. I just try to do it to maximize my knowledge gain instead of maximizing output.

    I'm of the belief that LLMs are multipliers of skill. If your base skill is zero, well, the product isn't great. But if you possess skill level 100, then you can really cook.

    Put more bluntly, a person with excellent base coding skills and great LLM skills will always significantly outperform someone with low base coding skills and great LLM skills.

    If I were writing code for a living, I'd have it generate code for me like crazy. But I'd direct it architecturally and I'd use my skills to verify correctness. But when learning something, I think it's better to use it differently.

    IMHO. :)

  • block_dagger 9 hours ago ago

    Talk to the LLM and have it explain code. Write very small examples by hand and make sure you understand how they work. Big software is just a bunch of those small things working together.
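
    For instance, a small example could be as simple as this (Python purely as an illustration): write it yourself, predict the output, then run it to check.

        # A tiny, self-contained example: count words in a string.
        def word_counts(text):
            counts = {}
            for word in text.lower().split():
                counts[word] = counts.get(word, 0) + 1
            return counts

        print(word_counts("the cat sat on the mat"))
        # {'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1}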

    • mc3301 7 hours ago ago

      Fully self-taught, the first time I ever worked with someone on real-world code I was interning for an open-source Python CRM. The owner said to me, "Anytime some code is difficult, just break it into smaller pieces. If it is still difficult, break it into smaller pieces."

      This has stuck with me since; it is indeed applicable to many facets of life.

  • austin-cheney 7 hours ago ago

    WTF. If you really feel that way then stop using AI. Either you are building confidence or you aren’t.

    Now, step back and do some serious introspection. First, look around. Odds are you are surrounded by imposters, irrespective of AI. Identify who the imposters are and isolate yourself from them. Secondly, solve hard problems. Confidence comes from learning. If you cannot perform without AI then your learning is low.

    At the end of the day you provide solutions to problems. If you cannot do that almost instantly in your mind, forming a vision, you are less valuable than someone who can.

  • 2muchtime 8 hours ago ago

    I’ve been coding as a hobby since about 2020. I’m not particularly great at it and I’m still trying to teach myself.

    I try to write whatever code - let’s say a function - by hand. It probably won’t work, so I have the LLM Socratically ask me questions about why it’s not working, and then I try to fix it by hand, repeating this until the function works and does what I want.

    If there is some harder-to-understand concept, I write a small article to try to explain it, and I have the LLM coach me through my trial and error of writing until the paragraph or essay or whatever amount of writing is needed to explain what I’m trying to learn is complete.

  • ipnon 8 hours ago ago

    2 things have not changed since the advent of AI, and never will change. The first is the calculus of computation. The second is the engineering of computation. All coding is just some interesting combination of these two forms of logic.

    You can learn both of these quickly and to a deep level with only 2 books. For the calculus of computation, read "Structure and Interpretation of Computer Programs" (SICP) by Abelson and Sussman. It is available for free.[0] If you understand all of this book, you understand all of the fundamentals of computer science. Every program you write from now on will be understandable to you, with enough persistence. But most importantly you will be able to think in computer programs by second nature, and communicate in this language. And when you talk to AIs in this language, they become exceedingly precise and powerful, because you lose the ambiguity in how you are conceptualizing the program.

    For the engineering of computation, read "The Elements of Computing Systems" by Nisan and Schocken.[1] An abridged version of this book is available for free in the form of the Nand2Tetris course. In this course you will start with an eminently simple digital construct, and use it to build, step by step, a full working computer that can run Tetris. You could even write a Lisp from SICP on this computer, and pretty easily too, as you'll see in SICP itself! Once you have completed both books you'll have met in the middle between abstract computer science and concrete computer science: coding.

    Just like in your math class - where you can see that one side of a right triangle is always longer than the others, but you cannot understand how or why, or explain it, or work with it, until you comprehend some simple theorems and functions - you cannot truly compose computer programs until you can speak the language of computer science. It used to be that you could make a career by copying code you saw online, patching bits and pieces together to create basically working code. But that era is over. AI reads and writes and searches millions of times faster than you. But still only humans are capable of new compositions. And in order to create these new compositions you have to be able to speak a mutual language that you and the AI can understand. That language is computer science, and it hasn't changed since time began, and it won't change in 10, 100 or 1,000 years from now when AI is capable of doing anything and everything better than we can. So if you want to stop struggling and start creating new and exciting things with computers, read these 2 books!

    [0] https://mitp-content-server.mit.edu/books/content/sectbyfn/b...

    [1] https://www.nand2tetris.org

  • globalnode 8 hours ago ago

    Use AI as a database to help you get up to speed with the domain you're working in - it's really good at that! Then write your own code. If you're learning, you can then submit that code to the AI and ask it for feedback. Take all feedback with a grain of salt, though; it may inexplicably start trying to redesign all your stuff with 10 levels of OO abstraction that no one would really write.

  • tamimio 9 hours ago ago

    I was talking about something similar with a friend recently. I told him the internet made people lazier, automation made people weaker, and AI apparently will make people more stupid.

    Probably, 10 years from now, it will be a flex if someone builds or does stuff without using AI - just like it is now if you use a manual screwdriver instead of an impact driver, or actually go to the library to research a topic instead of googling it.

  • exodust 9 hours ago ago

    > "the optimal path is somewhere in between"

    I think this is the correct answer. Also, we technically never stop learning. There's always some new coding trick that eluded us until AI spits it out.

    My 2 cents: switch from agent mode to chat mode and have better conversations about approaches to code. I'm constantly challenging AI to explain its code, including talking about the pros and cons of this or that method, and even the history of why certain new features were brought to JavaScript, for example. It's also fun to query the AI about performance optimisation, presuming we all want the fewest cycles used for a given procedure.

  • tayo42 9 hours ago ago

    Imo you need to struggle to learn and make things click.

    I don't see why, even with AI, you won't need a solid understanding of the parts of computing that programming is built on top of.

    Even if you're prompting, you need to know what to prompt for. How are you going to ask it to make something faster if you don't know it can be faster - or know not to waste time trying to make something faster that can't be?
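
    As a toy illustration (a made-up Python sketch, not from any real codebase): both of these find duplicate IDs, but you can only ask the AI for the second approach if you know it exists.

        # O(n^2): compares every pair - fine for tiny inputs, slow for big ones.
        def has_duplicates_slow(ids):
            return any(ids[i] == ids[j]
                       for i in range(len(ids))
                       for j in range(i + 1, len(ids)))

        # O(n): a set remembers what it has already seen.
        def has_duplicates_fast(ids):
            seen = set()
            for x in ids:
                if x in seen:
                    return True
                seen.add(x)
            return False

        print(has_duplicates_slow([1, 2, 3, 2]), has_duplicates_fast([1, 2, 3, 2]))  # True True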

    Go through something like cs classes from MIT and do the work.

  • incomingpain 2 hours ago ago

    I've been coding since about 2009.

    If someone were to ask me how to learn to code today, I don't know what I would say. My gut is telling me to say go into farming or marketing. Versus: go ALL-IN. Down the rabbit hole you go.

    >I often see people say that the solution is to just fully learn to code without AI, (i.e, go "cold turkey"), which may be the best, but I wonder if the optimal path is somewhere in between given that AI is clearlly changing the game here in terms of what it means to be a programmer.

    You need to know the fundamentals. You probably still need to learn to code manually. Thonny is the best beginner IDE.

    You will want to switch over to AI pretty quickly though. As a beginner, you probably want a GUI, so Antigravity, Codex, or Claude Code.

    >I have shipped a few projects, I always review AI-suggested code, do daily coding practice without AI, watch youtube videos, etc. but still don't know if I'm striking the right balance or whether I can really call myself a programmer.

    There is no license. You can call yourself that whenever you please. Here's a relevant hacker news blog though: https://paulgraham.com/identity.html

    >I'm curious how you have all handled this balancing act in the past few years. More concretely, what strategies do you use to both be efficient and able to ship / move quickly while ensuring you are also taking the time to really process and understand and learn what you are doing?

    Once you really get your AI coding skills up, you know your model, and you have a rock-solid architecture/design, you tend to be more concerned with the model accidentally deleting 1000 lines.

    But then openclaw bursts onto the scene and we don't need to know how to code anymore? Like, I give it a task, ask if it has a skill, it says no. It goes to Gemini CLI and creates its own skill. It's now giving me updates on changes. I didn't do anything.

  • thewhitetulip 8 hours ago ago

    I learnt programming when books were actually used, back when docs pages were barebones.

    My 2 cents: read the actual docs; these days docs are exceptional. Rust offers a full-fledged book as part of its docs. Back when Go was launched, its docs were inadequate, so I started to write a short GitHub-based "book" for newbies, and it did well (judging by the GitHub stars).

    Learn without AI, be an expert. And then use AI to write the code.

    Using AI to learn is honestly delusional. You don't learn when AI writes the code for you. Also, for a new language it'll take some time to get used to the syntax - hence writing by hand until you become an expert.

    The goal of writing software for your job is to write it within that sprint.

    But for a hobby, at least, you can take your time and learn.

    Although I'd recommend getting into depth, without AI, with whatever tools you are going to use at your job, because who knows - maybe your next company won't allow you to use AI!

  • marcus_holmes 9 hours ago ago

    I taught myself to code as a teenager back in the 80's on those early microcomputers (Commodore PET, Acorn Atom, BBC micro). Everything was much simpler, and easier to learn.

    A career in software development 30+ years later, and I'm back learning from day one again, because LLMs are profoundly changing how we do this.

    Example: two years ago, I built a website as an MVP to test a hypothesis about our customers. It took me 6 weeks, didn't look good, but worked and we used it to discover stuff about our customers. This week I've vibe-coded a much better version of that MVP in an afternoon. This is revolutionary for the industry.

    The state of the art on LLM coding is changing fast and by orders of magnitude. They still get things wrong and screw up, but a lot less than they did a year ago. I fully expect that in a couple of years [0] writing code by hand will be completely archaic.

    So, what does this mean for people learning to code?

    Firstly, that hand-rolling code will become artisanal, a hobby. Hand-coding a program will become like hand-carving a spoon; the point is not to produce the best spoon in the most efficient manner, but to create Art.

    Secondly, that commercial coding as a career will revolve around collecting business requirements, translating them into prompts, and orchestrating LLM code engines. This will be a cross between "Product Manager", "Project Manager", and "Solution Architect" in current role definitions.

    Thirdly, that at least for next few years, understanding how code actually works and how to read it will be an advantage in that commercial career space. And then it'll be a disadvantage, unnecessary and potentially distracting. Soft social skills will be the primary factor in career success for this profession in the future.

    The industry has been through similar changes before. Most obviously, the invention of compilers. Pre-compiler, programmers wrote machine code, and had to manage every single part of the operation of the computer themselves. Need a value from memory for an operation? You had to know where it was stored, clear a register to receive it, fetch it, and work out where to store the result. Post-compiler, the compiler managed all of that, and we were able to move to high-level languages where the actual operation of the computer was a couple of abstraction layers below where we're thinking. We no longer need to know the actual physical memory address of every value in our program. Or even manage memory allocation at all. The compiler does that.

    And yes, there was a generation of programmers who hated this, and considered it to be "not real programming". They said the compilers would write worse, less efficient, programs. And for years they were right.

    So, to answer your question:

    > AI is clearlly changing the game here in terms of what it means to be a programmer.

    > More concretely, what strategies do you use to both be efficient and able to ship / move quickly while ensuring you are also taking the time to really process and understand and learn what you are doing?

    Embrace the change. Learn to manage an LLM, and never touch the code. Just like you're not writing machine code - you're writing a high-level language and the compiler writes the machine code - the future is not going to be writing code yourself.

    Good luck with it :)

    [0] There are lots of questions around the finances and sustainability of the entire LLM industry. I'm assuming that nothing bad happens and the current momentum is maintained for those couple of years. That may not be the case.

    • delis-thumbs-7e 9 hours ago ago

      This is unbelievably depressing. Why learn any skills at all, since some model is going to do it all pretty soon and companies will only need a bunch of salespeople, your Musks and Bezoses, to sell that crap. The rest of us rot our brains online, living on government hand-outs and designer drugs.

      • jbloggs777 7 hours ago ago

        Why learn to play the drums, when there are drum machines? Or play any music, when there are MP3s? Or cook, when there's microwave dinners?

        If you can't answer the above, you might want to have a chat with a psychologist. We can and do create meaning in our own lives.

        Programming will change, but I won't miss creating the same boilerplate again and again. I expect to focus more on translating the business & technical requirements to decent quality results. I expect good interfaces and separation of concerns will be even more important, as whole modules might be rewritten from scratch rather than being modified, changing the way we think about maintainable code.

  • Sirikon 8 hours ago ago

    The correct amount of AI for learning is zero.