School has two jobs. AI is forcing us to admit it.
AI in one hand, pencil in the other: proof of mind and proof of work
TL;DR: This isn’t my usual article; it is something I’ve been thinking about a lot as a parent with three kids entering the workforce in the next couple of years. I’m grappling with what school is truly for in an AI era, how it might evolve from high school to college to the first year on the job, and what we should expect from students, educators, and employers. I hope you find it interesting and insightful, and if it sparks ideas or pushback, I’d love to have a conversation.
My kids, two of whom are in college and one in high school, keep me honest about school. A few days ago, my son texted me a screenshot of a homework problem and asked, “Can I ask an AI to help with this?” A year ago, I would have said no and moved on. Now I pause. Because the question is bigger than a single assignment. It is about what school is for.
Here is how I see it. School has two jobs. First, build minds: attention, judgment, taste, grit, the ability to sit with a hard idea and not flinch. Second, build marketable skills: the capacity to do useful work with real tools, on real teams, on real deadlines. We have treated those as the same thing. They are not. AI did not create the split; it just pulled the seam wide open, so parents like me cannot ignore it.
I do not have a neat reform package. I have a handful of working ideas, shaped by what I see in my own house and in client organizations. These are not rules or solutions. They are conversation starters.
High school: ideas worth trying, then testing
If high school is the last stop before the labor market for many students, it should give them two kinds of proof. Proof of thinking, and proof of doing. That could look less like seat time and more like mastery shown in public.
One thought: raise the status of in-class writing and oral defense. The take-home essay used to measure careful thinking. Today, it mostly measures how well a teenager can prompt a model and polish the output.
I am not angry about that; I just do not think it tells us what we think it does. Short, frequent in-class writing sprints, with a teacher present and no tools, might be a more accurate checkpoint. Add oral defenses where a student explains a claim, handles pushback, and shows the skeleton of their argument. That is not nostalgia. It is about validating the human part.
Another thought: require an audit trail when AI is allowed. If a student uses a model for research or drafting, show the prompts, the versions, and what changed. Grade the choices, not just the final paragraph. The adult world already works this way. Version history is accountability.
A third thought: product over points. Imagine a graduation portfolio that includes a research essay written under supervision, a data or media project built with AI tools and full version history visible, a talk delivered to an audience that is not just classmates, and one thing built for the community, with a real user who can say whether it helped. A portfolio like that travels better than a GPA. Employers understand it. Parents do too.
None of this requires new software or new buzzwords. It requires clarity about the purpose of a given task. Some tasks are for building mental muscle, so the tool stays off. Some tasks are for shipping a result, so the best tools should be on. Tell students which is which and why, then judge accordingly.
College: the monopoly is fading, the value is not
The four-year degree should not go away, but its grip will weaken unless colleges change how learning is delivered. The winners will resemble studios, clinics, and co-ops more than lecture marathons.
If I were advising a dean, I would push three shifts.
Shift one: AI literacy for every major. This is not about turning historians into coders. It is about historians using AI to annotate primary sources, pressure-test interpretations, and surface counter-arguments, then sitting for oral exams without any tools in the room. Business majors should run live cases that mix model-assisted research and analysis with memos and presentations written in their own voice. You can separate where the tool adds speed from where the student must own the thinking.
Shift two: stackable proof of skill. Accept industry certificates, apprenticeships, and real projects for credit. Tie them to seminars that sharpen judgment and communication, the things short courses skip.
Degree plus proof of skill beats degree alone.
Shift three: more time in the real world while enrolled. Co-ops, internships, and community projects are not optional side dishes; in some terms, they should be the main course. Faculty can supervise the reflection and the standards. Employers can supply messy problems that remind students what deadlines feel like.
The thorny question is cost. For many families, the math does not work. A practical path I often recommend in private is simple: start at a community college, transfer to complete your degree, and stack some focused certificates along the way, ultimately graduating with less debt and more evidence of ability. That choice deserves more respect than it gets.
How I think about AI rules, as a parent and a manager
Blanket bans fail. Free-for-all fails in a different way. The rule that works in my house and in my client teams is about intent and disclosure.
If the learning goal is to build a mental habit, the tool should be turned off. We say that out loud. If the goal is a product, use the best tools of your time, disclose that you did, and be ready to explain your choices. In practice, this means two things: clarity on the task's purpose and an audit trail. This is not a moral stance. It is quality control.
I also think we should worry less about “cheating” in the abstract and more about the assessment itself. Kids respond to the game we set up. Change the game. Make more of the grade live, in the room, with a person. Base more of the grade on process quality, not just the final screenshot.
Equity is not a footnote
Everything above is harder in under-resourced schools. AI can widen gaps or narrow them. If we care about fairness, we will invest in teacher training, device access, and clear usage norms, then measure who gains. It is not enough to run a pilot and declare victory because a few kids loved their AI tutor. We have to look at outcomes by student group and adjust. That work is tedious and necessary, which is precisely why it gets skipped.
First year of work: finish the job
Learning does not end at graduation. In many fields, the first six to twelve months on the job behave like a capstone. The smartest employers I know treat it that way. They pair new hires with mentors, and I encourage them to review AI use the same way they review code. They ask for version history on anything that touches a client. They make writing and presenting a weekly sport. The point is not to gatekeep; it is to speed up maturity. Schools and companies can coordinate here. Both sides win when they do.
What I am still wrestling with
Three tensions keep coming up in my kitchen table debates.
First, speed versus depth. AI can make students fast. I want them deep. How do we keep the slow parts that build taste and judgment, without turning school into a museum of older methods?
Second, individual help versus shared effort. AI tutors lower the cost of one-on-one help. Teamwork still matters in the real world. How do we use personal tutors without letting collaboration muscles atrophy?
Third, credentials versus demonstrations. Degrees still open doors. Portfolios and trial projects may open them faster. How do we blend the two in a way that employers trust and parents can afford?
I do not pretend to have settled answers. I do think we can agree on the questions and start running some honest experiments.
A better contract with students
If I had to write it down in one paragraph, here is the contract I wish we would make.
We will keep sacred spaces where thinking is slow and fully human. We will also build modern studios where you learn to work with the best tools available, and we will judge you on both your process and your results. We will ask you to show your work, including your prompts and your edits, not to police you but to teach accountability. We will measure what you can do, not just what you heard. And we will keep adjusting, with you in the loop, because the world you are walking into is moving.
That is where my head is today, as a parent and as someone who helps leaders run teams. I am not looking for a silver bullet. I am looking for a clearer split between the two jobs of school, and a shared willingness to design for both. If we can do that together, our kids will not just keep up. They will be ready.
You can reach me at steve@intelligencebyintent.com. These are the kinds of things I think about while walking Magnus (who will be 10 months old this week).