Category Archives: Not Technical

Why Bitcoin Could Never Have Been Invented in a University

Original article: https://bitcoinmagazine.com/culture/bitcoin-could-never-be-invented-in-a-university

The author, Korok Ray, is an associate professor at the Mays Business School of Texas A&M University and director of the Mays Innovation Research Center.

The gist: academic research at today's universities and colleges is mostly incremental work within a single discipline, which makes cross-disciplinary innovation hard to come by, while Bitcoin is precisely an achievement at the intersection of cryptography, money/economics and network science. The article closes with some suggestions for how universities could develop.

Since the announcement of its inception in October 2008, Bitcoin has reached a market capitalization of over $1 trillion. Its growth has drawn both retail and institutional investment, as the financial community now begins to see it as a legitimate store of value and an alternative to traditional assets like gold. Innovations in second-layer settlements like the Lightning Network make it increasingly possible for bitcoin to serve as a medium of exchange.

Yet, Bitcoin has a precarious and somewhat checkered history in academia. Curricula in universities are largely devoid of any mention of Bitcoin. Instead, the teachings are often left to student clubs and nonprofits. Over time this may change, as Bitcoin and the entire cryptocurrency market continue to grow, attracting attention from top talent in both engineering and business. Bitcoin’s absence from the university is not a problem with Bitcoin itself, but rather with the academy, with its insufficient embrace of innovation, its emphasis on backward-looking data analysis and its excessive preoccupation with individual disciplines rather than collective knowledge. Bitcoin can serve as an inspiration for what academic research can and should be. In fact, it presents a roadmap to change higher education for the better.

Similarities With The Academy

One may wonder why anyone should even assume a relationship between Bitcoin and universities. Technologists are in constant contact with real needs of customers today, while faculty develop basic science that (may) have application far into the future. After all, innovations like Facebook, Microsoft, Apple and even Ethereum were launched by young men who didn’t graduate from college. Yet, it’s no accident Silicon Valley and Route 128 both emerged in proximity to our nation’s greatest coastal universities. So, there’s certainly a correlation between universities and the tech sector. Even so, Bitcoin is different. Bitcoin has an even tighter relationship with its intellectual and academic roots. To understand this, we must peer into Bitcoin’s history.

At the turn of the century, a ragtag band of cryptographers, computer scientists, economists and libertarians — the cypherpunks — exchanged messages over an internet mailing list. This was an obscure electronic gathering of a diverse cadre of scientists, technologists and hobbyists who were developing and sharing ideas of advancements in cryptography and computer science. Here’s where some of the early giants of applied cryptography spent time, like Hal Finney, one of the early pioneers of Pretty Good Privacy (PGP).

It was on this mailing list that the pseudonymous creator of Bitcoin, Satoshi Nakamoto, announced his solution for an electronic payment system. After that announcement, he began to field questions from the forum on both the concept and its execution. Shortly thereafter, Nakamoto provided the full implementation of Bitcoin. This allowed participants of the forum to download the software, run it and test it on their own.

The Bitcoin white paper bears similarity to academic research. It follows the structure of an academic paper, has citations and looks similar to what any paper in computer science may look like today. Both the white paper and the conversations around it reference prior attempts at implementing the proof-of-work algorithm, one of the core features of Bitcoin. For example, the white paper cites HashCash from 2002, also part of the corpus of knowledge that preceded Bitcoin. Adam Back came up with proof-of-work for HashCash while trying to solve the problem of eliminating spam in emails.

Thus, Bitcoin didn’t fall out of the sky, but emerged out of a long lineage of ideas developed over decades, not days or weeks. We tend to think of technology as operating at warp speed, changing rapidly and being driven by ambitious, young college dropouts, but Bitcoin wasn’t based on “move fast and break things.” It was and is the opposite: a slow, careful deliberation based on decades of real science practiced not by kids, but more like their parents. The cryptography forum was similar in nature to an academic research seminar, where professional scientists politely but insistently attempt to tear down ideas to arrive at the truth. Though the concept of a white paper is now all the rage among alternative cryptocurrency coins and tokens, it’s the hallmark method of communicating ideas among the professional research community.

Even though the cryptocurrency economy today occupies center stage in the financial press and a growing share of national attention, when it emerged Bitcoin was as far from this as possible. It was obscure, technical and very fringe. In its long gestation from ideas that had been around for decades but unknown except to a small circle of cryptographers, economists and political philosophers, Bitcoin shares more in common with other radical innovations, like the internet, the transistor and the airplane. Just like those innovations, the story of Bitcoin is the triumph of individual reason over collective misperception. Just as the Wright brothers proved the world wrong by showing man could fly even though physicists claimed it was mathematically impossible, so too did Bitcoin confound the naysayers by building digital scarcity for the first time ever.

Why should we focus on Bitcoin rather than some of the other cryptocurrency tokens, like Ethereum? If you look under the hood, the majority of the innovation of cryptocurrency came from Bitcoin. For example, Ethereum relies on the same elliptic curve as Bitcoin, utilizing the same public key cryptography. Bitcoin emerged over a long gestation period and secret development by a pseudonymous applied cryptographer and was released and debated in an obscure mailing list. For this reason, Bitcoin shares many similarities with the arcane academic circles that occupy modern universities. No professional cryptographer made Ethereum; rather, it was a teenager who even admits he rushed its development. Thus, it’s only Bitcoin that has a deep connection to the academy, whereas the more incremental innovations crowding the cryptocurrency space now are more similar to the small advances taken in the modern technology sector.

Differences From The Academy

Bitcoin differs from the academy in important ways. Most significantly, Bitcoin is fundamentally interdisciplinary in a way universities today aren’t. Bitcoin fuses together three separate disciplines: mathematics, computer science and economics. It’s this fusion that gives Bitcoin its power and shatters traditional academic silos.

Public key cryptography has been the major innovation in applied cryptography and mathematics since its conception 50 years ago. The core concept is simple: Users can secure a message with a private key known only to themselves that generates a public key known to all. Therefore, the user can easily distribute the public key without any security consequence, as only the private key can unlock the encryption. Public key cryptography achieves this through one-way functions — transformations of data that are easy to compute but computationally infeasible to reverse. In Bitcoin, this occurs through elliptic curves over finite fields of prime order.
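
To make the private/public key relationship concrete, here is a minimal, self-contained Python sketch — my own illustration, not code from Bitcoin — that derives a public key from a private key on secp256k1, the curve named above. The constants are the published secp256k1 domain parameters; the function names are mine.

# Minimal sketch: deriving a secp256k1 public key from a private key (Python 3.8+).
P = 2**256 - 2**32 - 977  # prime modulus of the underlying field
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8
G = (Gx, Gy)  # generator point on the curve y^2 = x^3 + 7 (mod P)

def point_add(p1, p2):
    # Add two curve points in affine coordinates; None is the point at infinity.
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1) * pow(2 * y1, -1, P) % P  # tangent slope (doubling)
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P     # chord slope (addition)
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    # Double-and-add: compute k * point in O(log k) point additions.
    result, addend = None, point
    while k:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)
        k >>= 1
    return result

# A private key is just a large number; the public key is that number "times" G.
private_key = 0xC0FFEE1234567890C0FFEE1234567890C0FFEE1234567890C0FFEE1234567890
public_key = scalar_mult(private_key, G)
print(hex(public_key[0]), hex(public_key[1]))

The asymmetry is the whole point: going from private_key to public_key is a few thousand modular operations, while going backwards is the elliptic curve discrete logarithm problem, believed to be computationally infeasible.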

But public key cryptography isn’t enough. Because Bitcoin seeks to serve as an electronic payment system, it must solve the double-spending problem. If Alice pays Bob using bitcoin, we must prevent Alice from also paying Carol with that same bitcoin. But in the digital world, copying data is free and therefore, preventing double spending is seemingly hopeless. For this, Nakamoto utilized the blockchain, a construct from computer science. Cryptographer David Chaum laid the groundwork for the concept of a blockchain as early as 1983, in research that emerged from his computer science dissertation at Berkeley.

The blockchain is a linked list that points backwards to the original (genesis) block. Each block contains thousands of transactions, each transaction containing the ingredients for transferring bitcoin from one address to another. The blockchain solves the double-spending problem because it’s distributed, i.e., publicly available to all nodes on the Bitcoin network. These nodes constantly validate the blockchain with new transactions added only when all other nodes on the network agree (consensus). In our prior example, when Alice pays Bob, this transaction enters the blockchain, which all nodes observe. If Alice tries to use those same bitcoin to pay Carol, the network will reject that transaction since everyone knows that Alice has already used those bitcoin to pay Bob. It’s the distributed, public nature of the blockchain that prevents double spending, a problem unique to electronic payments.
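
As a toy illustration of that backward-linked structure — my own simplified Python sketch, not Bitcoin's actual block format, whose headers also carry a version, Merkle root, timestamp, difficulty bits and nonce — every block stores the hash of its predecessor, so altering any historical block changes every hash that follows it.

import hashlib, json

def sha256d(data: bytes) -> str:
    # Bitcoin-style double SHA-256, hex-encoded.
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

class Block:
    def __init__(self, prev_hash, transactions):
        self.prev_hash = prev_hash        # pointer back toward the genesis block
        self.transactions = transactions  # simplified: plain strings

    def hash(self):
        payload = json.dumps({"prev": self.prev_hash, "txs": self.transactions})
        return sha256d(payload.encode())

# genesis <- block1 <- block2
genesis = Block("0" * 64, ["coinbase -> miner0"])
block1 = Block(genesis.hash(), ["Alice -> Bob: 1 BTC"])
block2 = Block(block1.hash(), ["Bob -> Carol: 0.5 BTC"])
print("tip of the chain:", block2.hash())

# Tampering with history breaks every link after it:
genesis.transactions[0] = "coinbase -> attacker"
print("block1 expects:", block1.prev_hash[:16], "... but genesis now hashes to:", genesis.hash()[:16], "...")

Every node holding a copy of the chain can detect such an edit immediately, which is what lets the network reject Alice's attempt to spend the same coins twice.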

Indeed, Satoshi designed the blockchain specifically as a solution to double spending. It’s inherently inefficient, as it requires the entire network to constantly validate and reproduce the same data. This is also why most applications of blockchain technology outside of Bitcoin make little sense, as it forces an inefficient solution custom-built for electronic payments onto other applications that would be efficiently solved with central databases. The notion of a blockchain as a reverse-linked list by itself is not revolutionary in computer science, but its distributed nature specifically designed to prevent double spending is.

Even so, cryptography and blockchain aren’t enough. There needs to be a reason for the network to secure the blockchain. This is where the economics of Bitcoin shine. Nakamoto proposed a group of computers that would prove that the history of transactions did in fact occur. This proof requires costly work to be done. Nakamoto solved this by setting up a tournament in which individual computers (called miners) would compete to find a seemingly random answer through a one-way function called SHA256. The winner would receive newly minted bitcoin, which the network would release. The puzzle must be sufficiently challenging that the only way to solve it is to deploy more computational resources. Bitcoin mining requires real computation and therefore real energy, similar to gold mining a few generations ago. But unlike gold mining, the issuance schedule of new bitcoin is known by everyone.
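
The tournament described here reduces to a brute-force search: keep hashing the candidate block with different nonces until the hash falls below a target. A hedged Python sketch with a toy difficulty (real mining hashes an 80-byte block header against a vastly harder target):

import hashlib

def mine(block_data: bytes, difficulty_bits: int):
    # Grind nonces until double SHA-256(data + nonce) falls below the target.
    target = 1 << (256 - difficulty_bits)  # more difficulty bits = smaller target = harder puzzle
    nonce = 0
    while True:
        payload = block_data + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1

# Toy difficulty of 18 bits: roughly 260,000 hashes on average, a moment in pure Python.
nonce, digest = mine(b"candidate block header", difficulty_bits=18)
print("winning nonce:", nonce, "hash:", digest)

There is no shortcut to finding the nonce; the only way to win the block reward more often is to compute more hashes per second, which is exactly the costly work the paragraph describes.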

The economics of mining is the design of a contest that rewards new bitcoin to miners that solve a puzzle. This is a form of microeconomic mechanism design, i.e., a game in which individual agents compete for a reward. The macroeconomics of Bitcoin pertains to the issuance schedule, which adjusts predictably over time, with the block reward reducing by half every four years. This enforces the hard cap of 21 million bitcoin. This inherently limits the inflationary growth of the currency and imposes a constraint no fiat currency today must adhere to. The difficulty of the underlying puzzle adjusts roughly every two weeks so that blocks keep being produced at a steady rate regardless of the computing power of the network, providing a robust implementation despite exponential advances in computing power in the decades since Bitcoin launched.
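
The 21 million cap is not a separate rule but the arithmetic consequence of the halving schedule: a 50 bitcoin subsidy, cut in half every 210,000 blocks (about four years at one block per ten minutes), sums to just under 21 million. A quick check in Python:

SATOSHI = 100_000_000         # 1 bitcoin = 100,000,000 satoshis
BLOCKS_PER_HALVING = 210_000  # roughly four years of ten-minute blocks

reward = 50 * SATOSHI         # initial block subsidy
total = 0
while reward > 0:
    total += reward * BLOCKS_PER_HALVING
    reward //= 2              # integer halving, as the protocol does
print(f"total supply: {total / SATOSHI:,.8f} BTC")  # just under 21,000,000

The difficulty adjustment is a separate feedback loop: every 2,016 blocks (roughly two weeks) the target is rescaled so that blocks keep arriving about every ten minutes however much hash power joins or leaves the network.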

This interdisciplinary feature of Bitcoin is existential, not incremental. Without any of its three components (public key cryptography, a backward-linked blockchain and a mining contest using proof-of-work), Bitcoin would not function. By itself, each of the three components consisted of a coherent body of knowledge and ideas. It was their combination that was Nakamoto’s genius. So too will future radical innovations need to link together multiple disciplines in existential ways, without which their combination would not survive.

Why Not The Academy?

Why could Bitcoin not have emerged out of the academy? First, Bitcoin is inherently interdisciplinary, yet scholars at universities are rewarded for excellence in single domains of knowledge. Bitcoin fuses together ideas from computer science, mathematics and economics, yet it is unlikely any single faculty member would have the breadth of knowledge necessary for interdisciplinary consilience.

Second, the academy suffers from incrementalism. Academic journals explicitly ask their authors for the incremental contribution their work provides to the literature. This is how knowledge advances, inch by inch. But Bitcoin — like other radical innovations in history, such as the airplane and the transistor — made giant leaps forward that would likely not have survived the peer review process of the academy.

Third, Bitcoin rests on libertarian political foundations which are out of favor among the mainstream academy, especially professional economists. Baked into the software are algorithmic representations of sound money, where the Bitcoin protocol releases new bitcoin on a predictable schedule. This is very different from the world we live in today, where the Federal Open Market Committee has full discretionary authority on the money supply. The cypherpunks who vetted Bitcoin v0.1 shared a skepticism of collective authority, believing technology and cryptography can provide privacy to individuals out of the watchful eyes of the government or any large organization.

Most economists don’t share this skepticism towards central authority. At the very least, the social science community never took Bitcoin seriously. Besides, the Federal Reserve has an outsize role in both funding and promoting mainstream academic economic research. It recruits from top Ph.D. programs, hires bank presidents and governors who were former professors of economics, and encourages its staff to publish in the same academic journals as the academy. It is no wonder that university faculty, influenced by the culture of the Fed, would not embrace a technology that radically replaces it.

I asked all living Nobel laureates of economics to speak at the Texas A&M Bitcoin Conference, and all but one declined. Some admitted to not knowing enough about Bitcoin to warrant a lecture; at least they were honest about the constraints of the disciplinary model in which they’ve so successfully thrived. Others, like Paul Krugman, view cryptocurrencies as the new subprime mortgage (he also once predicted that the internet would have the same impact on the economy as the fax machine). Academic economists dedicated almost no attention to Bitcoin’s rise and even now remain ignorant of how the Bitcoin blockchain works, even though it is arguably the only real innovation in finance in the last decade.

Bitcoin is first and foremost an intellectual contribution. It doesn’t require a deep knowledge of industry, special insight into the current practices of firms or knowledge of idiosyncratic details of the labor and capital markets. It wasn’t built from existing practice, but rather from existing theory. For these reasons, Bitcoin emerged unapologetically out of the land of ideas, and should, in some sense, have come from the academy. An academic economist could conceivably have designed the mining tournament, a computer scientist the blockchain, and a mathematician the public key cryptography. It takes an unlikely fellow (or team) to combine these three innovations together. Universities develop faculties with deep expertise in their individual disciplines but do nothing to tie the disciplines together in the way Bitcoin does. For this reason, Bitcoin couldn’t have emerged out of the university, even though it rests on disciplines well established within the university. The problem isn’t the knowledge itself but its organization. And therein lies the opportunity.

How Did We Get Here?

In its current form, the academy is not suited for innovations like Bitcoin. After students enter graduate school, they learn the techniques of their own discipline, which they use to publish in specialized journals that earn them tenure and future academic recognition with a small set of peers within that discipline. These isolated corridors of knowledge have ossified over centuries ever since the early universities. How did this happen?

There have been two primary trends in the academy since World War II. By far the most important is the digital revolution. As computing power became accessible to anyone, the objective of science shifted from building theory to measurement. Suddenly, a wide array of social and natural science data was available to researchers from a laptop anywhere in the world. The growth of the internet spread data sharing and data availability, and advances in microprocessing power made large-scale data analysis cheap and easy. The academic community shifted en masse to data analysis and moved from trend to trend on 10-15 year cycles. The first cycle was on summary statistics and variance analysis, the second was on linear regression and the third on machine learning. When problems arose in the specific domain of each discipline, rarely did scholars return to their underlying theory for revision. Instead, they simply fed more data into the machine, hoping measurement error and omitted variables were to blame.

The growth of big data and statistics, in concert with machine learning, has led us to the present where artificial intelligence (AI) is a black box. No researcher can fully explain what exactly AI is doing. At the same time, questions have become smaller. Before, development economics as a field would ask, “Why is Africa so poor?” Now, research in the field asks whether placing a sign on the left or the right side of a bathroom door is more likely to lead to usage. This preoccupation with causality is intellectually worthwhile but comes at a high price, as often the researcher must narrow his domain to behaviors that are easily observable and measurable. The large, complex and mathematical theories developed after World War II were largely untestable, and so empirical researchers abandoned those theoretical foundations. Where once academics held the intellectual high ground by asking the biggest questions of the day, now empirical research dominates academic journals. Experimental physicists and empirical economists alike mostly cite other data-driven work.

As computers filtered throughout our society, students had exposure to computation earlier in their lives. By the time they arrived in college and in graduate school, they already had basic facility with data manipulation and analysis. Why bother with mathematics when some simple experiments and linear regressions can provide tables of results that can be quickly published? Over time, students gravitated towards data work as the academic profession slowly migrated away from math.

It became far easier for journals to accept papers with some small experimental or empirical fact about the world. Given that editors and referees make decisions on academic research on a paper-by-paper basis, there’s no overarching evaluation of whether the body of empirical and experimental work truly advances human knowledge. As such, data analysis has run amok with teams of researchers making ever more incremental advances, mining the same core data sets, and asking smaller and more meaningless questions. Does rain or sunshine affect the mood of traders and therefore their stock picks? Can the size of a CFO’s signature on an annual statement measure his narcissism and predict if he will commit fraud? (I’m not making this stuff up.)

One might think that advances in computation would have led research to verify some of the theories developed after World War II, but that has not been the case. In technical terms, many of those complex models are endogenous, with multiple variables determined in equilibrium simultaneously. As such, it’s a challenge for empirical researchers to identify specifically what’s happening, such as whether increasing the minimum wage will increase unemployment, as Economics 101 suggests. That has led to a turn to causality. But causal inference requires precise conditions, and often those conditions do not hold over the economy but rather in a few specific examples, like U.S. states that adopted anti-abortion laws at different times. The Freakonomics revolution in economics may not dominate the Nobel Prizes, but certainly has influenced the majority of published social science research.

The chief problem with this data-driven approach is that it is ultimately backward-looking. By definition, data is a representation of the world at a point in time. The entire fields of business and economics research are now almost wholly empirical, with scholars racing either to gather new datasets or to apply novel empirical techniques to existing datasets. Either way, the view is always from the rearview mirror, looking back into the past to understand what did or didn’t happen. Did low interest rates cause the Global Financial Crisis? Do abortions reduce crime? Does the minimum wage reduce employment? These questions are fundamentally preoccupied with the past, rather than designing new solutions for the future.

The second trend has been the shrinking of the theory community, both inside and outside the academy. The number of theorists has vastly shrunk, and they also have refused to collaborate with their much larger empirical and experimental colleagues. This tribalism led theorists to write ever more complex, intricate and self-referential mathematical models with little basis in reality and no hope for possible empirical validation. Much of game theory remains untestable, and string theory is perhaps the most extreme example of a self-referential world that can never be fully verified or tested.

Finally, academic theory trails technology by a long time. Often, mathematicians, physicists and economists provide ex-post rationalizations of technologies that have already been successful in industry. These theories don’t predict anything new, but rather simply affirm conventional wisdom. As the complexity of theory grows, its readership falls, even among theorists. Just like everything else in life, the tribalism of theory leads the community to act as a club, barring members who don’t adopt its arcane language and methods.

Thus, we’ve arrived at something of a civil war; the theory tribe is shrinking year by year and losing relevance to reality, while the empirical/experimental data community grows over time, asking smaller questions with no conceptual guidance. Both academics and technologists are left in the dark about what problems to solve and how to approach them. It also leads to a pervasive randomness in our collective consciousness, leading us to blow in whatever direction the winds of the moment take us. Economics has well-established theories of markets and how they function, yet technology companies are massive marketplaces unmoored in much of that same economic theory. Computer science rests on a sturdy foundation of algorithms and data structures, yet the theory community is obsessed with debates on computational complexity, while trillion-dollar tech companies perform simple A/B tests to make their most significant decisions.

We’ve reached a tipping point in the scale of human knowledge, where scholars refine their theories to ever more precise levels, speaking to smaller and smaller communities of scholars. This specialization of knowledge has led to hyperspecialization, where journals and academic disciplines continue to divide and subdivide into ever smaller categories. The profusion of journals is evidence of this hyperspecialization.

From Science To Engineering

Much future innovation will occur at the boundaries of the disciplines, given that much knowledge has already been discovered within existing disciplines, but there must be a greater transformation. Universities today still largely adopt the scientific method, establishing knowledge for its own sake and seeking to know the natural, physical and social world, but we shouldn’t stop there. Given their fundamental knowledge, scientists are in the best position to engineer better solutions for our future. Moving to an engineering mindset will force academics to design and implement solutions to our most pressing problems. In the long term, it will also close the gap between the academy and industry. The pressure students face to search for jobs and start companies, which takes a toll on their academic coursework, emerges because there’s a gap between the needs of the market and the academic curriculum. Were this gap to close, and students instead spent time in college building better solutions for the future, this cognitive dissonance would dissipate.

This transformation has already begun in some disciplines, like economics. One of the most successful applied areas of economics is market design, which unambiguously adopted an engineering mindset and delivered three Nobel Prizes in the last decade alone. These scholars came from engineering and adapted game theory to build better markets that can work in the real world, such as better ways to match kidney donors to recipients, students to schools or medical residents to hospitals. They also designed many of the largest auctions in use today, such as the government’s spectrum auctions and the ad auction within Google. There’s no reason the rest of the economics profession, or even the rest of higher education and the academic community, cannot similarly position themselves towards adopting more of this engineering mindset.

Over time, closing this gap between the academy and industry will relieve much of the public outcry against escalating tuition and student debt. Once professors orient their research toward developing better solutions for society, so too will their students and the companies that employ them. Students will no longer resent their faculty for spending time on research rather than teaching if that research directly creates technologies that ultimately benefit the students, future employers and society at large. Over time, this naturally will close the skills gap that America currently faces. Universities will no longer need to focus on STEM skills explicitly, but rather focus on providing technological solutions that will ultimately draw heavily from the STEM areas anyway.

A Call To Action

How can we reform higher education to produce the next Bitcoin? Of course, the next Bitcoin won’t be Bitcoin per se, but rather a first-principled innovation that conceives of an old problem in an entirely new way. I have three specific recommendations for university culture, priorities and organizational structure.

First, the academy must embrace engineering more explicitly than science — even on the margin. The Renaissance and the Age of Reason have led American higher education to celebrate science and knowledge for its own sake. The motto of Harvard is “Veritas,” or “truth,” while that of the University of Chicago is “Crescat scientia, vita excolatur,” meaning “Let knowledge grow from more to more, and so human life be enriched.” These universities, based on the scientific and liberal arts traditions, have done much to establish the corpus of knowledge necessary for human progress, but this last half-century has been the age of the engineering universities, with Stanford and MIT competing to build solutions for the world, not just to understand it. This ethos of engineering should extend beyond engineering departments, even and especially to the social sciences. For example, require all freshmen to take a basic engineering class to learn the mental framework of building solutions to problems. Economists have articulated the benefits of sound money for generations, but only through an engineered system like Bitcoin can those debates become reality.

This shift toward engineering is already happening to some extent within the social sciences. For example, the recent Nobel Prizes given to Paul Milgrom and Bob Wilson in economics celebrated their work in designing new markets and auctions to solve real resource-allocation problems that governments and society face. This community of microeconomic theorists is still a small minority within the economics profession, yet their work blends theory and practice like no other field and should have higher representation among practicing scholars. Universities should abandon the forced equity of treating all disciplines as equal, allocating an even share of faculty lines and research dollars to every discipline, no matter its impact on society. Instead, prioritize disciplines willing and able to build solutions for the future. This culture must come from the top and permeate down to the recruiting decisions for faculty and students.

Second, reward interdisciplinary work. The traditional, centuries-old model of deep disciplinary work is showing its age, while most of the exciting innovations of our time lie at the boundaries of the disciplines. Universities pay lip service to interdisciplinary work as a new buzzword across college campuses, but unless the incentives for faculty change, nothing will. Promotion and tenure committees must reward publications outside of a scholar’s home discipline and especially collaborations with other departments and colleges. While large government agencies, like the National Science Foundation, have increased the allocation of funding toward cross-disciplinary teams, when it comes time for promotion and tenure decisions, faculty committees are woefully old-fashioned and still reward scholars within rather than across disciplines. Over time, I expect this to change as the older generation retires, but the most pressing problems of society cannot wait and universities should pivot faster now. Unless promotion and tenure committees explicitly announce recognition for interdisciplinary work, nothing else matters.

Third, the academy must aim high. Too often, academic journals are comfortable seeking incremental contributions to the fund of knowledge. Our obsession with citations and small improvements inevitably leads to small steps forward. Academic communities have a reflexive desire to be self-referential and tribal. Therefore, scholars prefer small conferences of like-minded peers. Some of the biggest steps forward in the history of science came from giant leaps of understanding that could only have occurred outside of the mainstream. Bitcoin is one example, but not the only one. Consider the discovery of the double helix, the invention of the airplane, the creation of the internet and more recently the discovery of the mRNA sequence for the COVID-19 vaccine. True progress comes from unapologetically tossing out the existing intellectual orthodoxy and embracing an entirely fresh look. The standards of excellence for our faculty and students must insist they aim to solve the biggest problems facing humanity. Too often this discourse is silenced on campus, and over time, it erodes the spirit of our young people. To achieve this, allocate research funding based on impact and make these requirements strict.

The vast increase in wealth from the technology sector has put various pressures on campus. For one, it induces young students to drop out and start new companies, following in the footsteps of the young founders who dominate the technological and financial press. This happens only because there’s a rift between the rewards of the market and the activities of the university. Remember that Bitcoin emerged from a small community of intellectuals seeking to engineer a solution to an ancient problem using new technology. This could’ve easily occurred within the academy, and in some sense, it should have.

The corporate firm, either start-up or established, is the natural locus for incremental innovation. The constant noise of customer needs, investor demands and industry knowledge make it a natural place for small changes in society’s production possibilities. Radical innovation is uniquely suited to the academy with its longer, more deliberate timescale, access to deep science and isolation from the noise of the market, but it’s up to the academy to rise to that challenge. Let Bitcoin inspire us, so the academy becomes the quarterback and not just the spectator to the next radical innovation of our time.

The Metaverse's Roundabout Road

I came across an article criticizing the metaverse: metaverse == metaworse?

The skepticism goes like this: to make the metaverse easier for people to understand and use, it leans on excessive skeuomorphism. A metaverse bank, for instance, literally builds a virtual branch with virtual relationship managers and virtual teller windows. The problem is that this only aids comprehension and makes things easier to learn, while blocking genuine evolution.

Excessive skeuomorphism also inflates the metaverse's energy consumption.

It reminds me of an example: to the new generation, the Save icon is just some square shape of unknown origin, because floppy disks began disappearing in the late 1990s... Windows software of that era, to help users from the 1980s understand that a file had been saved, used the idea of saving to a floppy disk to stand for Save, and the convention has been passed down unchanged to this day.

A few key points that would underpin a successful metaverse:

Content creators — the same as in Web 2.0, except that in 3.0 it is decentralized and creators fully own their work.

A mature data economy — once ownership of a work is settled it can be traded freely, and the same goes for other, non-display data.

Cryptocurrency — the foundation that supports decentralization.


Why developed countries have no super app

Mainly because in the US and Europe, the combined coverage of the iPhone and standard Android (GMS) is so high that it forms a duopoly.

Under that duopoly, Apple and Google can themselves be the super app, and the various kinds of software sitting on top of the mobile OS have little room to grow horizontally.

Meta (Facebook) and Twitter are strong, but breaking away from the built-in services provided by Apple and Google runs into heavy resistance. Take payments: it is hard to exist independently of Apple Pay and Google Pay, much as it is hardly possible to bypass Visa and MasterCard.

AI DJ

It is actually quite natural for AI to take on the DJ's job; the popular music apps today already have channels where AI picks and plays the tracks.

Beyond that, AI can compose some background music on its own from a melody (drum patterns, MIDI, etc.) and, where possible, work with the venue's IoT environment to control the lighting.

AI can also watch the crowd's reactions on site (computer vision) and adjust the playback accordingly.

And of course, as a chatbot it can take song requests from the guests and can present a virtual avatar as well.

What is "multi" in the metaverse?

Previously, with "work everywhere" or multi-screen collaboration, the subject was still one person, one worker, who in different scenarios used different devices to work and to play: sending and receiving email, writing documents, watching movies, listening to music, and so on.

This is easy to understand: what we want to do is synced in the cloud, while any number of clients act as the carriers/entry points through which we connect.

The metaverse cannot keep extending this approach, with one person still entering and leaving the various virtual spaces in a single thread; the efficiency is far too low.

What could be considered instead is multiple identities in the metaverse: in different virtual spaces, AI is combined with the real individual's decision-making awareness. In most situations these AI composites can play the principal's role in the metaverse on their own, and the principal is only brought in for key decisions.

Work spaces, personal creative spaces, social spaces and so on could then run in parallel without conflict, maximizing the principal's capacity.

Of course, to use these composite AIs the principal has to be able to earn enough money; otherwise the metaverse is left with only one path: spending money on entertainment.


One additional point on acceptance: Gen Alpha (born after 2010) may find retail conducted in the metaverse easier to accept.

Another problem: before the virtual world becomes a standard part of life, most people of the preceding generations will rigidly copy their experience of physical space into the virtual world — online retail stores, for example. This creates a paradox. On the one hand, the virtual world will offer superior experiences that go beyond what the physical world can deliver, and these cannot be pushed back into the physical world (precisely because they go beyond it); on the other hand, copying physical-world experiences will not win over the new generation of metaverse natives.

Quite possibly, retail will simply cease to exist in the future, with the ownership of physical goods replaced by services.

A negative take on drone delivery

Years ago I was already seeing logistics companies use drone delivery, mostly as a gimmick, and industry insiders have been debunking the hype one piece at a time. Here is one such piece, translated:

Former Amazon executive and supply chain consultant Britton Ladd says one of the worst ideas in business is "the mania associated with using drones for delivery."

In a LinkedIn post, Ladd said: "Even though using drones is one of the least efficient and most expensive ways to make deliveries, many companies and individuals keep touting how drones will revolutionize retail and logistics."

"False. Let me be clear. Using drones to deliver a Coke, toothpaste and dental floss, or to deliver over-the-counter medicine — of course that can be done with drones. No question."

"The question that has to be asked is: are drones the right solution, or the best solution? The answer is no."

Ladd argues that the products drones deliver are best described as low in value and low in necessity. "Yes, prescriptions can be delivered by drone, but the proportion of such deliveries will be very low."

He points out that, according to Amazon's internal documents, drone delivery will cost $63 per package by 2025 (roughly RMB 635), while the e-commerce giant's goal is to deliver one million packages a year by drone.

"That is almost 20 times Amazon's average cost of ground delivery."

"Currently, shipping a package through third-party courier partners is estimated to cost between $4.50 and $5.50. Products shipped through Amazon's own logistics network cost roughly $3.47 per package."

Ladd said: "From a research perspective, I challenge the wisdom of using drones when a more economical and more effective solution is to load vans with the most popular products and give consumers access to those vans through an app."

"That model lets consumers, 365 days a year, 24/7, 'call a store' to their home. In many residential developments the vans can drive themselves."

A van can make many deliveries to many customers, rather than one drone making one delivery. Ladd argues that the right target for any last-mile delivery solution is a high ratio of deliveries per vehicle — something a drone cannot achieve.

"I would also point out that drones are essentially flying leaf blowers. It is unbelievable to me that any homeowners association would approve noisy drones delivering Snickers bars and Diet Coke. (I have spoken with several HOAs that reached out to me, and I warned them not to approve drones.)"

Another issue is liability. Recently an Amazon drone crashed in Oregon and started a forest fire.

"Who will be liable when a drone crashes onto the roof of a house and catches fire? Hint: it won't be the drone company or the retailer that rolled out the drones," Ladd said.

He concluded: "Do drones have a place? Yes. Absolutely. I am not anti-drone. I am against using drones where they make no sense."

Note: HOA = Homeowners Association.

Making traditional retail stores smart, or unmanned retail

First, aim for the smallest possible modification. Nobody has much money to spend; large changes or a tear-down-and-rebuild are costly, and in any case few smart stores have been fully successful so far — most are concept stores or experience stores.

Store size: not too big, not too small either; something like 1-3 times a chain convenience store is about right, with a reasonably regular floor plan.

Keep the SKU count under control — no more than 600? If there are more, start subtracting.

Use the existing infrastructure as much as possible: network, power, surveillance cameras, checkout counters.


AI side: collect and gather data for the current SKUs (buying a dataset is fine too). Train the model -> retrain in production (e.g., when SKUs are added or removed).

Hardware or infrastructure needed: cameras, video processing hardware (e.g., GPUs), a QR scanner (a camera can do this too, of course), an AI model server (edge compute works if it is cheap enough).

The AI here is really CV, so the protection of privacy-sensitive features such as faces should also be considered.


The purposes of AI in unmanned retail (a minimal sketch of this pipeline follows the list):

Identify the customer, bind them to a physical/virtual account, and track 100% of their in-store behavior

Recognize the products: category, quantity, state, etc.

Link the two, i.e., who took what, and how much

Self-checkout.
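
A minimal sketch, in Python, of how these four purposes could fit together as one event pipeline. Everything here — the class names, the event fields, the in-memory basket — is a hypothetical illustration of the architecture rather than any vendor's API, and the CV models for customer identification and product recognition are assumed to exist upstream.

from dataclasses import dataclass, field

@dataclass
class PickEvent:
    # Emitted by the (assumed) CV pipeline: who took what, and how many.
    customer_id: str   # from an entry-gate QR scan or person re-identification
    sku: str           # from the product-recognition model
    quantity: int      # positive = taken from the shelf, negative = put back

@dataclass
class Session:
    # One customer's visit: account binding plus a running basket.
    account: str
    basket: dict = field(default_factory=dict)

    def apply(self, event: PickEvent) -> None:
        self.basket[event.sku] = self.basket.get(event.sku, 0) + event.quantity
        if self.basket[event.sku] <= 0:
            self.basket.pop(event.sku)  # item was put back

PRICES = {"cola-330ml": 3.0, "toothpaste": 12.5}  # toy price list

def checkout(session: Session) -> float:
    # Self-checkout: total the basket and charge the bound account (stubbed).
    total = sum(PRICES[sku] * qty for sku, qty in session.basket.items())
    print(f"charge {session.account}: {total:.2f}")
    return total

# Simulated visit: the customer picks two items, puts one back, then leaves.
session = Session(account="member-0001")
for ev in [PickEvent("member-0001", "cola-330ml", 1),
           PickEvent("member-0001", "toothpaste", 1),
           PickEvent("member-0001", "toothpaste", -1)]:
    session.apply(ev)
checkout(session)  # -> charge member-0001: 3.00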

The degenerate versions of unmanned retail are vending machines, self-service meal stations and the like.

Technical debt

I first heard the term in 2014, in a meeting with Moh, who is more than ten years older than me and has decades of software experience; he brought it up.

Looking back now, it is people like him — who have built a lot of B2B software and maintained it for a long time — who understand this point deeply.

In B2B, and especially with large enterprise customers, what has been delivered cannot be casually replaced or upgraded: cost is one thing, and then there are usage habits, all kinds of internal walls, certifications and so on.

So the poorly planned and designed parts, the problematic implementations, the missing documentation, the lack of foresight — as time passes, requirements change, technology evolves and people move on, pieces of the old system, good or bad, gradually become or add to the technical debt.

Only by recognizing that technical debt is inevitable can we deal with it objectively.

A few ways of handling certain kinds of technical debt (a small sketch of the second point follows the list):

  • Open up the software integration standards
  • Abstract the models so that modules are not locked to a single vendor
  • Improve incrementally; don't expect to get there in one step
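
For the second point, a common way to keep a module from being locked to one supplier is a thin interface plus per-vendor adapters, so that swapping suppliers is an isolated change. A minimal Python sketch with made-up vendor names:

from abc import ABC, abstractmethod

class PaymentGateway(ABC):
    # The abstraction the rest of the system depends on, instead of a vendor SDK.
    @abstractmethod
    def charge(self, account: str, amount_cents: int) -> str:
        ...

class VendorAAdapter(PaymentGateway):
    def charge(self, account: str, amount_cents: int) -> str:
        return f"A-{account}-{amount_cents}"  # would call vendor A's SDK here

class VendorBAdapter(PaymentGateway):
    def charge(self, account: str, amount_cents: int) -> str:
        return f"B-{account}-{amount_cents}"  # would call vendor B's API here

def settle_order(gateway: PaymentGateway, account: str, amount_cents: int) -> str:
    # Business logic sees only the abstraction; changing vendors is a one-line change.
    return gateway.charge(account, amount_cents)

print(settle_order(VendorAAdapter(), "acct-42", 990))
print(settle_order(VendorBAdapter(), "acct-42", 990))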

Of course, these are all a software person's ideas. In China, the answer is often simply to shut the system down or rebuild from scratch — and the big enterprise customer can always run a new tender.

Cold weather, and the stylus stopped working

This has happened from time to time before: when marking up documents on the DPT-RP1 the pen would often fail to write. I always assumed its battery was low, so I charged it, and then it would write again.

But this Spring Festival has been rather cold, and because of the pandemic I have been staying at home. When reading and wanting to annotate something, the pen kept failing to write; I charged it, and afterwards it still didn't work well. So I started to wonder whether the stylus battery was reaching the end of its life...

I searched online, and someone on Tieba put it this way:

Contact with a live/current-leaking device. This is the cause I want to highlight. I have not seen it mentioned anywhere else, yet it is exactly the one that troubled me most. Because I often look things up while reading papers, I frequently use the e-paper device stacked on top of my laptop. I never suspected this until I recently noticed, by chance, that when the two are stacked together the probability of the pen misbehaving is much higher. This may have to do with the power socket, the laptop's case material and the charging cable — in other words, my laptop may be leaking current.

My desk does indeed hold a phone, a Mac and a pile of charging cables; the electric and magnetic fields are probably messy to begin with, and the stylus is a fairly sensitive device, so once it picks up interference it stops writing.

So I lifted the RP1 off the desk to write, and everything went back to normal.

That solved a long-standing annoyance — otherwise buying another stylus would have been a bit of a waste.

Here's hoping the almighty Sony keeps working for a long time to come.

Headless commerce and smart venues

First, the title: headless commerce is a concept for e-commerce systems/platforms that began to take off around 2019/2020. Roughly, it means that because front-end requirements change frequently and live on all kinds of carriers, the front end should be completely decoupled from back-end merchandise management. From the back-end merchandise platform's point of view, providing a Web API is enough; there is no front-end "head," hence "headless" commerce.

What justifies such a decisive separation is that the likelihood of change on the two sides is wildly out of proportion.

Similarly, while thinking about smart venues recently, I noticed the same trait: venues come in endless shapes and varieties, often relying on a phone app or a customized mini program, yet getting the phone and the other devices to interconnect is painful to debug.

The devices themselves are broadly similar. But if the devices are too diverse and inconsistent, a smart venue becomes a pile of bespoke projects — spending money just to make some noise. To sustain a reasonable, viable business model, convergence at the hardware level has to be considered.

Even with converging hardware, the software stays inconsistent: different venues need different interfaces, flows and back ends. So how do we cut the cost of this software-level customization?

We can borrow the headless-commerce idea: let the lower-level software expose hardware interconnection and business-oriented APIs to the upper business layer, so that the business side can build its workflows in a no-code way. A rough sketch of such a layer follows.
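
A rough sketch, in Python, of what such a headless venue layer might look like: a device gateway hides vendor-specific hardware behind one generic invoke call, and the business layer (or a no-code tool driving it over a Web API) only deals in device ids and actions. All names below are hypothetical.

from typing import Any, Callable

class DeviceGateway:
    # Lower layer: registers heterogeneous devices behind one generic API.
    def __init__(self) -> None:
        self._actions: dict = {}

    def register(self, device_id: str, action: str, handler: Callable[..., Any]) -> None:
        self._actions[(device_id, action)] = handler

    def invoke(self, device_id: str, action: str, **params: Any) -> Any:
        # This is the entire surface the business layer sees; in practice it
        # would be exposed as a Web API, mirroring headless commerce.
        return self._actions[(device_id, action)](**params)

# Vendor-specific handlers live below the gateway.
def set_lobby_brightness(level: int) -> str:
    return f"lobby lights set to {level}%"

def show_welcome(text: str) -> str:
    return f"door screen shows: {text}"

gateway = DeviceGateway()
gateway.register("light-lobby", "set_brightness", set_lobby_brightness)
gateway.register("screen-door", "show_text", show_welcome)

# The "no-code" business side then becomes declarative data driving invoke():
welcome_flow = [
    {"device": "screen-door", "action": "show_text", "params": {"text": "Welcome!"}},
    {"device": "light-lobby", "action": "set_brightness", "params": {"level": 80}},
]
for step in welcome_flow:
    print(gateway.invoke(step["device"], step["action"], **step["params"]))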