Nov 26, 2024

How I Would Go About Learning an Arbitrary Subject Where No Full-Fledged Adaptive Learning System is Available

I’m using an LLM to learn biology. My overall conclusion is that IF you could learn successfully, long-term, by self-studying textbooks on your own, and the only thing keeping you from learning a new subject is a slight lack of time, THEN you can probably use LLM prompting to speed up that process a bit, which can help you pull the trigger on learning some stuff you previously didn’t have time for. BUT the vast, vast majority of people are going to need a full-fledged learning system. And even for that minuscule portion of people for whom the “IF” applies… whatever the efficiency gain of LLM prompting over standard textbooks, there’s an even bigger efficiency gain of a full-fledged learning system over LLM prompting.

by Justin Skycak (@justinskycak) justinmath.com

Want to get notified about new posts? Join the mailing list and follow on X/Twitter.


On a recent podcast, Zander asked how I’d go about learning an arbitrary new subject.

I get this question a lot, but it’s kind of a hard question, so I hadn’t really attempted to answer it until now.

I focused on the idea that – even if you have a good understanding of how learning happens and what kinds of practice techniques are effective – the hard part of the problem, the part that’s less in your control, is finding a high-quality curriculum.

Ideally, you’d want an adaptive learning system that is both rigorous and comprehensive – and in most subjects, that doesn’t seem to exist yet. Yes, there’s Math Academy… but what if you want to learn, say, biology? Where do you go?

I actually looked into this last year because I wanted to shore up my own biology knowledge (so I can have more in-depth conversations with my wife, who’s a graduate student in immunology / bioinformatics). And I couldn’t really find anything besides various textbooks and lecture-based courses.

Now don’t get me wrong, it’s possible to teach yourself from textbooks – in fact, that’s how I learned most of the math I know – but man is it brutally inefficient compared to a good adaptive learning system. And that inefficiency is really what’s prevented me from pulling the trigger on learning biology.

So on the podcast we got to talking about the idea of using an LLM to emulate an adaptive learning system.

The idea being that – while the emulated system is probably not going to be as good as a system built by a company whose sole job is to build that system – if you prompt it with a lot of background knowledge on how efficient learning works, and correct erroneous pedagogical decisions whenever needed, you might end up with a makeshift learning system that’s at least more efficient than standard textbooks.

Efficient enough to actually get me to pull the trigger on learning biology.

Session #1

I threw GPT-4o a PDF of the Math Academy Way along with the following prompt:

The results were actually pretty good. Not amazing, but good enough that I felt like the learning experience was efficient enough to get me to pull the trigger on learning biology for 20-30 minutes most days of the week.

(Zander recommended that I use Claude, and I know that’s the going advice here on X/Twitter, but Claude said the Math Academy Way pdf was 6x too big and asked me to split it into 6 separate files, so I just said screw it I’m using ChatGPT. May switch over to try out Claude once I build up my biology learning habit a bit.)

Even after providing the main prompt along with the Math Academy Way pdf, I still had to correct ChatGPT on some pedagogical issues and get it on the rails before it started working well enough for me to start actually learning.

The first challenge was that, when it started asking me diagnostic questions, the questions read like essay prompts. Here’s the corrective feedback I gave:

That seemed to fix the question type issue well enough to move on, but the next issue was that even after it said it was done with the diagnostic phase, it just kept asking me questions without presenting any new material to learn from. I had to explicitly ask it to provide me with instructional content:

At this point, it started providing brief instructional material followed by questions, and things started feeling right.

There was a minor snag with true/false questions that should have been fill-in-the-blank for retrieval practice, which I fixed as follows:

And then going forward I also included supplemental feedback in my fill-in-the-blank response if I thought I was missing additional prerequisite knowledge:

That seemed to work pretty well.

In the full session of about 25-30 minutes, I answered 18 questions. I probably spent about 10 of those minutes providing prompts and corrective feedback to keep ChatGPT on the rails pedagogically, but I’m anticipating there will be less of that in the future. So, I’m expecting to get through about 1 question per minute in the future… not bad!

I copied the entire conversation into a text file so that next time, I can supply that file (and the Math Academy Way) and ask it to pick up where it left off.
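If you wanted to script this assembly step instead of pasting files into the chat by hand, a minimal sketch might look like the following. The section headers, prompt wording, and placeholder file contents here are hypothetical – they are not the actual prompt used in these sessions:

```python
# Sketch: assemble the opening message for the next tutoring session from
# the pedagogy reference text and the saved transcript of the last session.
# Section headers and wording are hypothetical placeholders.

def build_resume_prompt(pedagogy_text: str, transcript_text: str) -> str:
    """Combine the pedagogy reference and the prior transcript into one prompt."""
    return (
        "You are acting as an adaptive tutor for introductory biology.\n"
        "Follow the pedagogy described in the reference below.\n\n"
        "=== PEDAGOGY REFERENCE ===\n"
        + pedagogy_text.strip()
        + "\n\n=== TRANSCRIPT OF PREVIOUS SESSION ===\n"
        + transcript_text.strip()
        + "\n\nPick up exactly where the previous session left off."
    )

# In practice, these arguments would be loaded from the saved text files.
prompt = build_resume_prompt("(contents of the Math Academy Way)",
                             "(contents of session 1 transcript)")
```

The same string could just as easily be pasted into the chat UI as the session’s first message.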

Overall, I would say the experience feels significantly more efficient than learning from a textbook – definitely not as efficient as a true adaptive learning system, but efficient enough to get me over the hump of starting to shore up my biology foundations.

That said – so far, this is all material that I was at one point familiar with (at least enough to pass an intro-level university biology course), so it remains to be seen how well things work out when I move from refreshing forgotten knowledge to learning completely new material. Learning completely new material is the true test of pedagogy; refreshing forgotten knowledge is more robust to pedagogical defects.


Session #2

I threw it the MA Way again, as well as a transcript of the previous session, and asked it to run another session, this time including some spaced review.

Things got off to a rocky start:

  1. it tried to give me spaced review on something it never actually tested me on, and
  2. it started down an inefficient learning path in an attempt to tailor the session to my (incorrectly) perceived interests.

I corrected these issues with the following prompt:

It then started asking me how I would like to go about the session – whether to jump directly into the weeds on new content, or start with a high-level overview. On the surface that might sound like a nice question – but remember, I’m a novice and I just want to learn this stuff as efficiently as possible. I shouldn’t be making these pedagogical decisions – that’s the whole point of the tutor. I conveyed this with the following prompt:

After that, there was another pedagogical issue to correct:

And then I came across another knowledge gap that I had to explicitly ask it to backfill for me, and I had to remind it about a pedagogical issue I corrected last time:

I also started getting fed up with its verbosity, which I fixed as follows:

Overall, I didn’t make a whole lot of progress today. I spent most of the time trying to get ChatGPT on the rails.

To improve the situation for next time, what I’m going to do is compile all of the feedback I’ve given ChatGPT and put that directly in the main prompt. All this stuff will of course be in the tutoring log PDF, but I probably need to make the really important feedback front-and-center.
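As a minimal sketch of what that restructuring could look like – the feedback lines below are paraphrased from the corrections described above, and the prompt wording is hypothetical:

```python
# Sketch: compile the accumulated corrective feedback and put it at the top
# of the main prompt, ahead of the pedagogy reference and the session log.
# The feedback strings are paraphrased placeholders, not the verbatim prompts.

critical_feedback = [
    "Use fill-in-the-blank questions for retrieval practice, not true/false.",
    "Provide brief instructional material before asking questions on it.",
    "Do not ask the student to make pedagogical decisions; just decide.",
    "Only give spaced review on material actually covered in past sessions.",
    "Keep explanations concise.",
]

def build_main_prompt(feedback: list, pedagogy_text: str, log_text: str) -> str:
    """Prepend the critical-feedback rules so they are front-and-center."""
    rules = "\n".join("- " + item for item in feedback)
    return (
        "CRITICAL FEEDBACK FROM PREVIOUS SESSIONS (follow strictly):\n"
        + rules
        + "\n\n=== PEDAGOGY REFERENCE ===\n" + pedagogy_text
        + "\n\n=== TUTORING LOG ===\n" + log_text
    )

prompt = build_main_prompt(critical_feedback,
                           "(Math Academy Way)", "(session logs)")
```

Putting the rules first is the point: instructions buried deep in a long log seem to carry less weight than ones stated up front.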


Session #3

Massive improvements in round 3! Finally settled into a rhythm where the process will work for me long-term.

The game-changing trick was – in addition to feeding it the MA Way and then a log of previous sessions – also reminding it of the particular critical feedback I had given it in previous sessions. (There was a bit more fine-tuning today but it was quick and minor.)

Granted, the efficiency is nowhere near MA-level, and even then it requires an experienced driver behind the wheel… but man is this more efficient than any other biology learning resource I’ve found. The friction has been reduced enough that learning biology finally feels realistic.

Here’s a dump of all the critical feedback that I’ve given it so far:


Conclusions

I should point out that this is kind of a “magic demo” where it might look cool/promising but there’s a subtle trick that I’m using to make the situation WAY easier in my particular use-case than it is in general reality.

The trick, the secret sauce, is that I’m continually leveraging my own expertise on efficient learning. I’m continually prompting it with nudges and recalibrations to keep the chat on the rails. The moment I start behaving more like a typical student, or even a serious student who is not an expert on efficient learning, things go completely off the rails.

It’s not always instantaneous. But it’s like there’s a compounding misalignment that, without continual nudges and recalibrations, eventually reaches a critical level and drives you off the road into a ditch.

That’s really the hardest part of building an effective learning system – keeping students on the road. Most people do not have a good enough understanding of what effective training entails to oversee/manage the process themselves. Left to their own devices, they will typically derail the process if given enough agency to do so – even if they don’t mean to.

Overall, my conclusion is that IF you could learn successfully, long-term, by self-studying textbooks on your own, and the only thing keeping you from learning a new subject is a slight lack of time, THEN you can probably use LLM prompting to speed up that process a bit, which can help you pull the trigger on learning some stuff you previously didn’t have time for.

BUT the “IF” here applies to a minuscule portion of people. The vast, vast majority of people are going to need a full-fledged learning system.

And even for that minuscule portion of people for whom the “IF” applies… whatever the efficiency gain of LLM prompting over standard textbooks, there’s an even bigger efficiency gain of a full-fledged learning system over LLM prompting.

