“It’s less about human versus machine and which human skills are important versus not. There are a lot of human skills where you will not be able to retain higher pricing power, because you are highly substitutable despite your specific human skill.”
– Sangeet Paul Choudary
About Sangeet Paul Choudary
Sangeet Paul Choudary is the Founder of Platform Thinking Labs and co-author of Platform Revolution, which has sold over 300,000 copies, and author of two other leading books on platform strategy. He is a keynote speaker and advisor to major organizations worldwide, and is a World Economic Forum Young Global Leader. Sangeet’s work has been featured on four occasions in the HBR Top 10 Must Reads collections.
Substack: www.platforms.substack.com
Website: www.platformthinkinglabs.com
LinkedIn: Sangeet Paul Choudary
What you will learn
- Exploring the impact of AI on skill premium and the future of work
- The rise of platforms and the commoditization of skills in the gig economy
- The dynamics between labor, talent, and capital in a technology-driven market
- How technology reshapes the value and distribution of work
- The role of AI in augmenting versus substituting human roles
- The significance of adaptability and learning in navigating technological change
- Strategies for individuals and organizations to harness AI for a prosperous future
Episode Resources
Transcript
Ross Dawson: Sangeet, it’s wonderful to have you on the show.
Sangeet Paul Choudary: Thank you, Ross. It’s such a pleasure to be here.
Ross: We’ve got to talk about a number of things, but you’ve been writing recently about the future of work, specifically the impact of AI on skill premium. And I have some probably somewhat different views here, so perhaps we can frame this a little bit as, maybe not a debate, but a discussion. So first of all, you can lay out your case at a high level, and I can perhaps bring some other perspectives to bear as we go.
Sangeet: Sure, absolutely. My interest in the role of technology, or the impact of technology on the future of work, started with the rise of platforms. The first thing we saw with the rise of platforms was new ways to organize work in markets. Uber and the gig economy are a great example of this, where work gets extremely commoditized. When you think of a driver, their natural advantage, apart from the license that they had for the taxi, was the ability to navigate the city, just the knowledge of the city. Then Google Maps and GPS come in and commoditize that particular knowledge. Now suddenly, anybody without that level of depth about navigating the city can become a driver. That essentially shows how commoditization enables centralized market making: the more a skill gets commoditized, the more it lends itself to centralized market making, where the right resource can be allocated to the right problem. That’s essentially what happens in gig economy platforms, Uber being an extreme example, where you don’t really care who’s coming to take you from point A to point B; it’s just a resource being allocated to a problem.
Now, the idea that came out of that work, which I conducted with the ILO Future of Work Commission, was that the more skills get commoditized, the more they lend themselves to centralized market making, and the more a skill gets commoditized, the more agency and power, especially in terms of setting your price and a premium on your skill, moves away from you to the platform that’s making that allocation. And broadly speaking, markets exist everywhere. It’s not just Uber; even if you’re working in an organization, you’re constantly being matched to opportunities, there’s an internal market, and if you’re a freelancer, you’re obviously in an external market. That’s just the nature of the networked world we live in. So the essential idea starts with this: markets are everywhere, and centralized market making is a thing. And secondly, the more a skill is commoditized, the more it lends itself to centralized market making.
Ross: Just bouncing off that for starters, the key point there is fungibility, as in one unit of labor can be replaced by another unit of labor, the substitutability. As you suggest, there is this tussle where people are looking to distinguish themselves in some form and be able to charge a premium, and the platforms are encouraging the opposite: the more they can push this idea of fungibility, the more anybody can be substituted by anyone else. Of course, there are a number of tiers to the work platforms, going from ones which are truly fungible, all the menial and Mechanical Turk type roles, to driver type roles, where there’s some degree of differentiation in terms of how friendly or clean or whatever the car is, through to a plenitude of expert platforms of various guises, where people are not fungible. We go through the different tiers, from low-level through to mid-tier through to high-level skills, and these become a way to access people with significant skill. There are probably some people who would not necessarily be on those platforms, but arguably, for example, a talent agency: if you look at the major Los Angeles talent agencies, these are in a sense platforms. Only through those platforms can you access the George Clooneys or the other stars of this world. And of course there are other platforms where there is a fair bit of replaceability. So there are these pushes where platforms are trying to get more fungibility or substitutability. But that’s not necessarily the best for the economy or society as a whole, and many people are pushing back on it. But I do agree that the force towards the commoditization of those skills is still pretty strong.
Sangeet: There’s a whole spectrum in terms of how commoditized work is, which is why I called out centralized market making and the degree to which work is commoditized. There are two aspects to this, and I’m going to park one aspect very quickly and then go towards the other, which lends itself more to the topic of AI. The first piece is just this: even in highly differentiated work, there are a lot of factors that determine your ability to charge a skill premium, and to what extent that premium is proportional to your skill. My argument there, which I’ve shown with a lot of evidence as well, is that because there’s a superstar effect that takes place, the higher you are rated, the more likely you are to get future work, and hence the more likely you are to get higher ratings as well. There’s a feedback loop that isolates a set of superstars within highly specialized work, while the vast majority of non-superstars languish in the middle. And the difference between the middle and the superstar is typically a factor of 10x, not a factor of 1.5x. The additional issue that comes with this is that there’s an arbitrariness to this inequality. If you’re early to a platform, the feedback loops can work in your favor much longer; you can be featured on lists that are promoted by the platform much earlier. If you were an intern at Twitter in 2007, you were most likely part of the ‘people you should follow’ list when it came out, and all those interns today have half a million followers on Twitter, not because they’re necessarily thought leaders, but because they were at the right place at the right time.
So there are all of these arbitrary inequalities which get amplified because of feedback loops. That’s one part of the argument: even in highly specialized work, because of the mechanized nature of reputation systems, outcomes are not necessarily meritocratic. On the other side, where this argument moves towards AI, is that just like GPS and Google Maps commoditized my knowledge of the city as a driver, new technologies, especially AI, commoditize a lot of different skill sets and move what is today highly specialized labor into highly commoditized labor over time. As that shift happens, the key distinction we need to make is whether AI is helping you become more differentiated or driving you towards more commoditization, and depending on where you lie on that spectrum, the outcome on your skill premium will vary a lot, and independent of where you are on that spectrum, networks will play a role as well. I’ll come back to that. So my point about AI is that AI is fundamentally different for two reasons. What’s similar between AI and previous technologies is that there’s augmentation, and when there’s augmentation, you can up-level. But very often what happens is that the lower skilled workers are able to up-level to a higher level of mediocre performance, and the highest skilled workers are not able to command that premium anymore. We’ve seen that with previous technological cycles.
Ross: Just on that point. Not all studies have shown that. Certainly, notably, the BCG study and a couple of others have shown this effect, that generative AI improved the performance of low-performing people more than that of high-performing people. But there are other studies, such as a recent one on superforecasting, where there was no differentiation between high and low performers, and I think others where the gap increased. So whilst the jury is still out, as in it’s still relatively early data, I think it depends very much on the type of task. For example, with writing tasks, I think it’s fairly evident that if you’re already very skilled, it’s not going to improve you much, whereas if you’re not a very skilled writer, generative AI will help you substantially, and there are a number of tasks like that. But as we move into more sophisticated tasks, then I believe there is more potential. For example, take strategic decision making: if you are a highly skilled strategic decision maker and you are able to use generative AI effectively, whilst it’s very difficult to measure the impact of that, it would be the case that it’s far harder for a less skilled person to use these tools effectively to increase their capabilities, because they don’t even have a foundation for understanding how and where those should be applied.
Sangeet: Yeah, and these are fair points. My point about less skilled workers benefiting more versus more skilled workers is not an AI point; it’s a technology point in general. We’ve seen that since the time chainsaws replaced axes, and so on. It’s a more generic point, and I absolutely agree that there are certain types of tasks, or certain types of skills, which will be augmented towards better performance, and certain types where the less skilled workers will move further ahead. That leads us to a different point, which is: where do you lie on this talent versus labor spectrum? Because in any economy, depending on what kind of skills are valued, talent ends up seeing gains that are closer to the gains on capital, while labor ends up becoming commoditized over time. And so with every new form of technology that comes in, what ends up happening is that what is classified as labor fundamentally changes. If I go back to the Uber example, you have capital, which is venture capital; you have talent, which is the engineers at Uber, whose returns are closer to capital than to labor; and then there are the drivers, who are really labor. And so with AI, again, we need to ask the question: are you talent? Are you labor? Are you going to work on top of AI as part of the value chain, or is AI going to be part of the value chain in a way that eats up certain aspects of your work?
Ross: Just to interject there for a moment, back to your earlier point around, essentially, polarization. The polarization of value in work has been well demonstrated; David Autor and a lot of others have shown over decades, in different contexts, different industries and different geographies, that we have experienced a polarization of work, and there’s a whole array of factors behind that. But overlaid with that is the allocation of value between labor and capital, which essentially disconnected around 1970-ish, in the US market in particular, where we started to see the gains from labor productivity going to capital as opposed to labor. Whether that is a function of technology, or a factor of a whole array of managerial decisions overlaid with government policy, or other factors, which have meant that superior bargaining or negotiating power has been able to extract a greater share of value from increased labor productivity, that’s not necessarily a function of technology.
Sangeet: And that’s precisely my point. My point is not labor versus capital, it is labor versus talent. To the extent that talent extracts value from technology, it is able to position itself closer to capital, because capital needs to allocate itself towards maximizing the returns on technology, and talent is the mechanism through which those returns are maximized. So I fully agree that the capital versus labor polarization is Reaganomics and all those macroeconomic factors and decisions that were taken around the 70s to 80s. But especially in the information economy, the way to maximize the returns on capital has been by maximizing leverage of technology, and that’s where talent comes in. So the question really is, which side of the technology spectrum do you fall on? Do you end up becoming commoditized labor, or do you end up becoming a mechanism through which the returns on technology can be maximized? That’s the key idea, which then takes us to another aspect: AI, unlike previous technologies, is self-learning. And that starts having an impact on talent as well, because one of the advantages of talent has been learning advantages over a lot of what we consider labor. We’ve really moved far from the traditional blue collar versus white collar distinction; that’s a very industrial distinction. The talent versus labor distinction, to a large extent, is about the extent to which a learning advantage leads to an improvement in your skill premium. And what AI does, again, is attack that learning advantage, because AI is a learning technology as well.
Ross: To your point, the key issue is whether we have substitutability or complementarity of technology to human roles. In some cases, if it is substituting tasks within broader roles, then it clearly reframes the original role; in others, if it can act as a complement, then it augments the value created in that role. From there, one of the frames here is that this is not a steady-state system, in terms of there being defined roles or defined ways in which value is created. So as we start to see an evolution of value creation, of the function of organizations and of the roles of humans, those start to be complemented by AI in different ways.
Sangeet: The thing is that we are gradually building this out as we go through the conversation. I’m not claiming that it’s static; the pie is not fixed, the pie keeps changing, and how the pie gets distributed keeps changing as well. And so the distinction in terms of where you land, talent versus capital, becomes important, and where you land has two more aspects to it. One, going back to the earlier point: jobs are a bundle of tasks, and when technology impacts jobs, most technologies typically substitute certain tasks. You end up doing a lot of things at work, and not everything is what you’re paid for; some of those things are just business as usual. To the extent that the substituted tasks do not impact your ability to charge a skill premium, you can re-bundle around the other tasks and continue to charge a skill premium, or even up-level. But to the extent that they do, that is where, if you’re a writer and you’re primarily being paid to write content, and not so much to send emails or attend meetings, and your primary skill-premium-enhancing task has been impacted, you need to immediately start thinking: well, how do I re-bundle? Maybe there’s an advantage now not in the writing itself, but in the followership I’ve gained, in the relationships I have, in the unique insight I have about a particular issue. So that re-bundling is key. What’s really important is to look at your job bundle, because the job is a bundle of tasks, and ask: where are you charging a skill premium, and what’s the impact of AI on that? Is it enhancing that, is it substituting that? I think that becomes a key aspect over here.
Ross: One of the questions there as well is who drives that re-bundling. Individuals, as you suggest: the writer may choose to present themselves in a different way, or to find different ways to price their work, or whatever it may be. But organizations are now engaged in task reorganization, or as I think of it, work redesign. Rather than looking at the current state and asking how to rejig current roles and redefine their edges, they throw that entirely open and say: we now have analytic AI, generative AI and a whole lot of talent, and we can reconfigure that. Part of that is being able to do it in ways where people are not doing things which are commoditized; those will be outsourced, given to low-cost people, or done by generative AI, and the roles start to emphasize where humans add the most value. And coming back to the skill point, even if AI continues to gain skills or to learn on the job through all of its interactions, the human-AI system, as it were, can continue to evolve, where the human is not static just as generative AI is not static, and together they can maintain that ability to create value in conjunction with generative AI, in a similar fashion to before.
Sangeet: Well, the question really is how many humans remain in the system in order to generate the same amount of value. You could keep a human in the system to get the human-plus-AI component, but if instead of ten humans you’re now able to achieve the same productivity with one human, what do you do with the remaining nine? They end up slipping from the talent bucket to the labor bucket. And what then ends up happening, and we’ve seen this before in previous cycles of unemployment, is that once you have folks entering the labor bucket, there’s a feedback loop that moves in that direction: your access to learning reduces, your access to networking opportunities reduces, your access to just generating a premium on any skill reduces. So that’s one issue that comes in. The other piece is this.
Your point is very valid: who gets the right to re-bundle? Within organizations, to a large extent, the right to re-bundle is determined by the level of voting power that you can accrue in your favor. What I mean by voting power is that if you think of your job as a bundle of tasks, or even as a bundle of goals, you are constantly participating in different teams, contributing to their goals. Your contribution in a team can get replaced by technology substituting your position in that team, because, and this is the final thing I want to talk about, AI is unique in that AI agents are goal-seeking. They’re not just task-substituting; they can combine tasks towards goals. So agents are the fundamental unit of re-bundling in AI, re-bundling tasks back to goals, and then goals are bundled inside teams towards higher-level goals. So if you were achieving a certain goal in a certain team and an agent can replace you, and there are a lot of factors that determine whether an agent can replace you, which I’d be happy to go into as well, then the voting power that accrues to you reduces, because now you have one less team voting in your favor. And so over time, these factors start having a feedback effect, because of which, even if you retain certain roles, your ability to command a premium internally within an organization may go down over time. To be very fair, these are at this point still hypotheses that we’re validating at different levels. While we’re talking about this in the abstract, I personally apply AI in a lot of my work, and I see my productivity increasing big time, so I can see how it works when it is applied in the right way. Because I have agency in my work and in my workflow, I’m able to re-bundle my workflow around AI. Just as an example, I don’t see the advantage of AI in writing; I see the advantage of AI in brainstorming, providing it a few different inputs and then working around that and building new ideas. That in effect reduces my time to get creative, because there’s less staring at a blank page and more brainstorming happening at the back end. So depending on what your bottlenecks are in your work, if you can use AI to resolve those bottlenecks, if you can use AI to up-level your skills around those bottlenecks, those are ways in which you can augment yourself towards greater productivity. So there’s an abstract level at which we’re framing these hypotheses, and we need to keep gathering evidence of how people actually use it at a very specific level.
Ross: As you say, these are all hypotheses, and hopefully, by the nature of these kinds of conversations, we are beginning to shape the way we want things to go, as well as obviously to discover, because these are very deep uncertainties. I want to come back to the AI agents piece, because I think that’s a deeper and very important topic. But just going back, I think there are a couple of reframes. One is this idea of demand elasticity for productivity. There are some roles where you don’t need more once you’ve got a certain amount; let’s say coffee: once you’ve had enough coffee, you’ve had enough coffee, and you don’t need more skills to be able to create a wonderful coffee. So the demand elasticity is low.
Whereas there are some other things, and I think a good example is software development in the broader sense, not just the underlying skills such as coding, where the amount of software we have managed to develop to date is a tiny fraction of what we could and will develop, and which could create superior value. So it’s not as if one programmer or software developer 10xes their productivity and therefore you don’t need another nine; maybe each one can multiply their output by ten, or by two, whatever it may be. Because demand is essentially fully elastic for what they create, you can continue both to generate more value supported by generative AI and to reward those people for that. And I think another overlay on this is the ability to attract talent. Crudely, the leadership of a company can take one of two frames. One is: we have generative AI, which has significant capabilities, and we will look to substitute as many of our existing employees as possible. Or, at the opposite end of the spectrum: we have generative AI, and we will look to complement all of our existing people to the highest possible degree. And I think it’s fairly safe to say that the latter organization, the one that looks to complement, will find it far easier to attract talent.
Sangeet: It comes back to the talent versus labor divide. A lot of the things you’re talking about essentially apply to talent, and I’m not disagreeing with or disputing that it absolutely does apply to talent. The definition of talent is the ability to help capital get a higher return on technology. So to the extent that humans can help capital gain a higher return on new technologies coming in, that’s great; they will continue to survive on the talent side of the spectrum and they will be able to capture the growing pie. My key consideration is just this: anecdotally, you can create a lot of arguments to support that talent will win, and absolutely it will. The question is, what percentage of the population is going to be talent and what percentage is going to be labor? If we have the last ten years to go by, it is highly polarized, with a very small fraction moving towards talent and a very large fraction increasingly moving towards labor.
So the real question is: even if you are able to grow the pie, if the section of the working population that can be considered talent actually reduces, how do you then distribute or redistribute the spoils to labor? I think that’s where it sort of breaks down. So I absolutely agree, and a lot of people listening to this podcast, I would argue, are probably thinking about these issues and trying to see how they can reposition themselves on the talent side, so I’m not necessarily saying these issues apply to them. And I don’t agree that software engineering is going away; I do not agree with that at all. I think it’s going to reform and position itself towards gaining greater returns on the new technologies that come in. But forms of software engineering that do not help you do that will move towards labor, and the same thing applies to writing, the same thing applies to any skill. So my point of view here is simply this: if, with every new shift in technology, you can align yourself towards generating higher returns on capital using your skill, not for yourself but for the capital owners, you are part of talent. If you own some of the capital, then more power to you. But that’s how I think about the talent versus labor divide over here.
Ross: Yeah, and as both you and I and everyone have seen, this reality of the polarization of labor has continued. The reality is, with a platform network economy and long-tailed distributions, there are many factors pushing towards this polarization of value, unfortunately, and many of the policies we need to enact are to reduce that inequality. And so one of the pieces: your articles make the case that skill premia will reduce, and I think that’s plausible, but I suppose a lot of our discussion so far suggests that skill premia can be maintained.
Sangeet: The skill premium is going to reduce for sure if you move from talent to labor; if you reposition yourself as talent, then it can be maintained, and it can even grow. My argument is more about the vast majority of the population that will inevitably move towards labor. And a large reason for that is that the pace at which technology is changing is not the pace at which the larger population of humans is learning to reposition themselves. That’s just the reality. Most people in this world are not listening to podcasts like this and are not thinking about issues like this. They are getting impacted significantly, shifting towards the labor side of the spectrum.
To the extent that you move towards the labor side of the spectrum, I am fairly confident that the skill premium is going to reduce. We’ve seen that happen in multiple cycles of technology before. So really, the question is not whether the skill premium will reduce or increase; it’s more about who is going to land in labor and who is going to land in talent, and I won’t even say which roles, because the concept of roles itself is fluid now, and what kinds of skills give you the position to sit on the talent side of the spectrum. If you’re really good at selling, for example, and you can improve your ability to sell the advantages of new technology, you suddenly reposition yourself to generate returns on the technology for the capital that has been invested towards it. So it’s really about how you keep protecting your position on the talent side and keep increasing it over there. And the talent-capital relationship, thanks to Silicon Valley, is very deeply ingrained: the more you position yourself on the talent side, the more you are in a position to gain some forms of capital over time. For that segment of the population, I absolutely agree that, if anything, the trends of AI will accrue in their favor. But in the long run, inequality will increase, because most of the population will actually move to the labor side of the spectrum. So I think that’s really the core of the argument: whether we believe that most of the population will move to labor, or whether we believe that most of them will reposition themselves as talent.
Ross: Yes, and from that frame it goes back to the question from before: do individuals do that for themselves? Do they re-bundle or develop their own skills to be complementary to AI? And/or do organizations or employers reframe that? I think everyone has the opportunity, of course, to complement their skills with AI for their own roles, current or future. And the reality is that a minority of organizations, but hopefully a significant proportion, will start to redesign work so that it does provide scope for human capabilities to flourish, which I certainly believe there is a place for. This comes back, of course, to policy and the ways in which it’s framed, to try to mitigate what I think we agree is a strong force for the polarization of labor, or, as you say, for people to move from talent to labor. But overall, I still think that there is a decent case, whilst with many challenges, for the possibility at least of a prosperous future of work, broadly told.
Sangeet: Yeah, for knowledge work, I think it’s still open, but for other kinds of work it’s really a question. What I do not agree with is people saying consultants are going away, or lawyers are going away, just because ChatGPT is here; it’s not going to be as binary and as deterministic as that. Instead, what we are going to see is that the skills that help you generate higher returns on capital will matter. In consulting, for example, what helps you generate higher returns on capital is relationships and brand, and the ability to own the outcome through some form of insurance: nobody got fired for working with McKinsey, and so on. The ability to generate returns on that capital is not necessarily about creating decks; that’s just part of the bundle. So AI can help you create decks, but if you own the brand, if you own the relationships, and if you own the ability to cover your clients, you have the ability to actually extract a higher premium. So it comes back to that. It’s not as straightforward as saying this profession is going away or that profession is going away. You need to think about what generates the returns on capital in a particular profession, and to what extent your job controls some of those factors on a day-to-day basis. That then gives you the voting power to re-bundle things in your direction, or even to have a say within the organization.
Ross: Absolutely, and part of your point there is that you mentioned, for example, relationships, which are fundamental in any network economy or ecosystem, and these are basically human capabilities. So whilst our cognitive capabilities, I believe, will still be extremely important even in a world of extremely advanced AI, there are many of those human factors, and the fact that we do like to deal with humans, that essentially maintain a lot of that pricing power.
Sangeet: Absolutely, and I’m not debating that; it’s not a human versus machine debate, it’s a question of to what extent relationships deliver the returns on capital. If you are a nurse taking care of the elderly at an old age home, that’s not a relationship that’s delivering superior returns on capital, so that job will continue to exist, but it will move into labor. But if you are somebody who’s managing client relationships at a large law firm, that’s a relationship that generates returns on capital. The place where we lose this debate, I think, is when we start thinking about machine versus human. I don’t think that’s the real debate; the debate is talent versus labor. Labor is essentially any human work that does not generate supernormal returns on capital, and talent is any human work that does generate supernormal returns on capital. Eventually, all labor does generate returns on capital; that’s what all the Uber drivers are doing. But they don’t have the negotiating power to move that side of the pie in their direction; the supernormal returns are for the data scientists sitting at Uber. So that’s the distinction I’m trying to make. That’s what really matters. It’s less about human versus machine and which human skills are important versus not. There are a lot of human skills, and the nurse taking care of the elderly is just one great example, where you will not be able to retain higher pricing power, because you are highly substitutable despite your specific human skill.
Ross: I do think that we will place more value on distinctly human capabilities, such as relationships and care and so on. And if the economy prospers from the boost of generative AI, as the likes of McKinsey and Andreessen Horowitz predict, then that surplus, however you frame it, is something many people will choose to spend on people, in all sorts of ways: personal trainers, personal services, care, whatever. So I don’t necessarily think the pricing will be pushed down.
Sangeet: That’s interesting; that’s the distinction between value creation and value capture. I absolutely believe the personal trainers will create more value, but whoever accrues the relationship, the brand, that’s where the value will move. If we’ve seen anything in the digital economy, it is just this: value creation does not sit with value capture. So I absolutely believe that human capabilities will generate a lot of value. I am highly skeptical that all of it will be captured in favor of human capabilities.
Ross: Yes, through what we could almost call the tyranny of platform structures. We could continue for a long time, but I think this has been, as a start at least, a very good conversation and discussion on this topic. I saw these articles on your Substack; where can people go to learn more about your research and writing on this?
Sangeet: Yeah, absolutely. The latest work that I publish is on my Substack, at platforms.substack.com, and I also publish it on my website, platformthinkinglabs.com. So those are the two specific places where you can find my latest work on this topic.
Ross: Excellent. Thanks so much for your time and your insights. I think we should continue this conversation once we have a bit more data.
Sangeet: I look forward to it.