
Silicon Valley Jargon


Startup

To many, a startup is just a new small business. While a small business may be content to remain small, however, a startup intends to grow into a large company. Startup founders are driven to create something that has a significant impact on their industry or market. A startup is searching for a disruptive business model that achieves product-market fit repeatably and scalably. A key word is "searching": a startup's model is unproven and its market uncertain. Because their offerings are speculative, startups have a high failure rate.


Founder

A founder envisions something great but lacking in the world and then takes the crucial next step of creating an organization to make it a reality. A founder often starts a company in order to manifest his or her idea, initially assuming all risk and reward. When several people unite to start a company, they are known as co-founders. Co-founders share ownership, with the percentages determined by mutual agreement. While the occasional startup is launched by several more-or-less equal co-founders, more commonly there is a key, initiating founder, with the remainder of the founding team referred to as co-founders.

Angel Investor

In the world of startup financing, there is a gap between friends and family and venture capital. Here lies the angel investor. While venture capital investments are typically at least $1 million, a typical angel investment can range from a few thousand dollars to the low millions. Angel investors may take convertible notes or participate in a seed round, among themselves or alongside a VC.

Venture Capitalist

A venture capitalist is a professional who invests third-party funds in early-stage companies. This contrasts with an angel investor, who typically invests his or her own funds. Venture capitalists invest capital in these companies in exchange for an ownership position in the firm and its potential financial gains. The venture capitalist primarily uses money from accredited or institutional investor clients, which it amasses into a pool of capital called a fund, typically structured as a limited partnership. These investor clients are thus known as limited partners and, in this context, the VC as the general partner. A single fund may invest in as many as 20-40 startups. The limited partners share in the gains and losses. Limited partners generally commit their capital for only a limited amount of time, such as 10 years. The general partner is entitled to a fixed percentage of profits, which they can take as exits close. But if the profit level is not maintained through future exits, the limited partners are entitled to claw back enough early profits to bring the general partner into compliance.

The venture capitalist might work independently or as a partner in a venture capital firm, for short a VC firm or just VC. Money raised from a VC firm is referred to as venture capital. Only a general partner has decision-making authority; entrepreneurs are initially more likely to interact with a VC firm's associates. Some VC firms use contracted scouts, often founders of former portfolio companies, who are compensated with a stake in the investment. Some people now go into VC firms straight out of school. Also, some serial entrepreneurs have emerged as venture capitalists, leveraging their hands-on, operational experience.


Disrupt

The dream of every Silicon Valley startup is to disrupt an industry: to produce an innovation so different from what came before that it is a game-changer that upends the status quo and becomes the next big thing. Such innovations create disruption as those relying on traditional business models are left at a competitive disadvantage. The idea of "disrupting the X industry" is also often expressed in shorthand as "Disrupt X." Disruption often involves seeing what others don't; sometimes it involves seeing the world in a whole new way. That is referred to as a paradigm shift. Although startups should think big, they should focus on a well-defined and achievable vision, rather than try to boil the ocean.

When expressing a desire to engage in disruptive innovation, the phrase "break shit" may also be used. This is the opposite of "if it ain't broke, don't fix it" and reflects a desire to upend established ways of doing things. The constant disruption that occurs in Silicon Valley leads to a cycle of creative destruction, in which new products and new companies are constantly emerging while old ones fade away.

Disruptive innovation The term disruptive technology was introduced by Joseph Bower and Clayton Christensen in a 1995 article titled "Disruptive Technologies: Catching the Wave" and then further explored by Christensen in his book The Innovator's Dilemma. Christensen later changed his term to disruptive innovation to capture the idea that the business model that the technology enables is what creates the disruptive impact.

X For Y

X for Y is an expression of an analogy in which an entrepreneur's product or service is said to have a similar business model to a well-known, established company (X), but targeting a different market segment or use case (Y). For example, "Airbnb for boats" describes a startup that applies a sharing-economy business model to boats, rather than to homes and apartments. On the theory that "there is nothing new under the sun," or at least nothing easy, the "X for Y" approach is a natural one in adapting a successful formula to a new situation.

Stealth Mode

Stealth mode, a term borrowed from the military, refers to a period when startups keep quiet about their plans. Stealth mode is primarily meant to prevent a competitor, especially one with greater resources, from getting wind of the idea and developing it first. It is particularly relevant before intellectual property (IP) filings are made.


Crowdsourcing

A sponsor company crowdsources services, ideas, or content by soliciting contributions from a large number of people (a crowd) rather than from employees in a chain of command or from organized suppliers. It is an alternative way to generate input on, or actions that support, a company's business activities. Crowdsourced content that is intended to persist online is known as user-generated content.

Fail Fast

Traditionally, failure was considered only a negative, but Silicon Valley has developed a new way of thinking about it. Because innovation rarely takes the form of a straight line, failures are inevitable on the way to success. From this perspective, failure in Silicon Valley is commonly worn as a badge of honor, not shame, among entrepreneurs. The key is to learn effectively from failures during the course of creating, executing, refining, iterating, or pivoting.

Design Thinking

Thinking like a designer means finding ways to seamlessly and simply meet human needs, while balancing technical feasibility and economic viability. Although design thinking has its origins in processes used to design physical products, it can apply to many disciplines. Some of its steps are similar to those of Lean Startup, but design thinking is a more collaborative and deliberate process focusing on more expansive problems and thus more concerned with the process of idea generation known as ideation. It has been applied, for example, to designing an activity meter and motivational website to fight obesity, a neonatal baby-warmer for use in rural areas, and ways to increase the accessibility of quality higher education. It is used both in traditional corporate settings and in the developing world.


Scale

To scale, in its simplest form, means to grow from some current state to a much larger one. A key question VCs will use as a filter to evaluate any startup concept is "Will it scale?"

The ability to scale requires two things: a practically limitless total addressable market and diminishing marginal costs. The market can grow by providing more services to existing users, but ultimately it will be necessary to serve a broad array of users. Diminishing marginal costs allow a business to grow quickly because less work is needed to obtain and serve each additional customer.


Eat Your Own Dog Food

In startup culture, to "eat your own dog food" is to use your own product or service internally as a way to validate its quality and capabilities.

Beta Test

Beta testing is also sometimes referred to as user acceptance testing (UAT) or end-user testing. In this phase of software development, applications are subjected to real-world testing by the intended audience for the software. The experiences of the early users are fed back to the developers, who make final changes before releasing the software commercially.

For in-house testing, volunteers or paid test subjects use the software. For widely distributed software, developers may make the test version available for downloading and free trial over the Web. Another purpose of making software widely available in this way is to provide a preview and possibly create some buzz for the final product.

Agile Development

Agile development methodologies focus on developing software by iterating in short cycles and adapting. They promote respect for and connection among motivated individuals by emphasizing personal responsibility.

Agile was a reaction against waterfall development, which takes a more sequential and monolithic approach, first designing, then coding, and finally testing the entire system. And as with navigating a waterfall, backing up is highly discouraged. Such an approach assumes omniscient understanding of a static world, then demands obedience from developers and acquiescence from customers.

Scrum/Extreme Programming Scrum is one Agile methodology that emphasizes the dynamics of each iteration, specifying various roles, meetings, and workflow processes. In particular, Scrum separates the role of product owner, responsible for the vision, priorities, and release schedule of a product, from that of Scrum master, responsible for facilitating meetings and ensuring that processes are both well-designed and enforced. Another Agile methodology, Extreme Programming (XP), emphasizes efficient and purpose-driven development processes with multiple levels of feedback. A notable one, test-driven development, involves writing test cases before the code to be tested.
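
A minimal sketch of test-driven development: in TDD the tests below would be written first and would fail until the function is implemented to satisfy them. The slugify() helper and its test cases are invented for illustration.

```python
import unittest

# In TDD, TestSlugify is written before slugify() exists; the
# implementation is then added (or refined) until the tests pass.
def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_extra_whitespace(self):
        self.assertEqual(slugify("  Agile  Development "), "agile-development")

if __name__ == "__main__":
    unittest.main()
```

Each short red-green cycle (failing test, then just enough code to pass) is itself a small iteration in the Agile sense.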


Iteration

Iteration is the process of continually refining and tweaking the features of a product in order to improve it. After introducing a new product, a company will receive and interpret feedback from customers and likely choose to refine the initial concept to incorporate their concerns and experiences. Each such cycle is considered an iteration. To iterate is to perform a sequence of iterations.

Iteration is a key principle of agile development methodology and the Lean Startup movement. Iterations, by joining the end of the development process to the beginning, improve both by allowing requirements and design to emerge over time while reducing the risk of incompatibilities through frequent product integration. The Scrum methodology specifies iterations by fixed blocks of time, cutting back on the scope of work as needed (time-boxing).

Lean Startup

The term Lean Startup is inextricably linked with serial entrepreneur Eric Ries, who introduced it in a 2008 blog post and published the book by that name in 2011. The Lean Startup methodology brought together Agile methodologies and his teacher/mentor Steve Blank’s work on customer development. The latter emphasized getting extensive feedback on all aspects of the firm’s business model, including product features, pricing, distribution channels, and customer acquisition strategies. The Lean concept has developed into a full-fledged movement, embraced by thousands in companies of all sizes and spawning its own mini-industry of conferences and consultants.


TAM, SAM and SOM

TAM, SAM and SOM are acronyms that represent different subsets of a market.

TAM: Total Addressable Market is the total market demand for a product or service.

SAM: Serviceable Available Market is the segment of the TAM targeted by your products and services that is within your geographical reach.

SOM: Serviceable Obtainable Market is the portion of SAM that you can capture.
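
As a toy illustration of how the three subsets nest, the figures below for a hypothetical boat-sharing startup carve SOM out of SAM and SAM out of TAM; all dollar amounts and percentages are made up.

```python
# Hypothetical sizing for a boat-sharing startup (illustrative only).
tam = 5_000_000_000   # total worldwide spend on boat rentals, $/year
sam = tam * 0.10      # assume 10% of TAM is in our region/segment
som = sam * 0.05      # assume we can capture 5% of SAM early on

print(f"TAM: ${tam:,.0f}  SAM: ${sam:,.0f}  SOM: ${som:,.0f}")
```

Investor pitches typically present exactly this funnel, with the SOM figure defending the near-term revenue plan.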


Prototype

A prototype is a simplified version of a product that is intended to convey the look and feel, i.e., design elements including the layout and basic workflow, along with varying amounts of functionality, in order to make the concept more concrete to potential investors and validate it with potential customers. At one extreme, a prototype may be a functioning system lacking aesthetic elements or account management features. At the other, it may be a wireframe or, more generally, a mock-up, indicating only what each screen looks like and how control flows from one to another. The underlying principle is related to that of the MVP, i.e., that one should implement only enough to learn what steps to take next, but the prototype may fall short of the viability criterion. Some prototypes may be intended to be discarded once the lesson has been learned, while others form the basis of the ultimate product. A prototype may be presented as a demo, short for “demonstration.”

MVP (Minimum Viable Product)

One defining element of the Lean Startup method is the need to go to the customer, early and often. Entrepreneurs following this approach will often bring along their minimum viable product (MVP), the first version of a product that demonstrates an ability to fulfill the startup’s value proposition to any extent. It is the simplest or most basic (i.e., minimum) product instance that can still perform the required functions (i.e., viable).

The motivation for an MVP is to allow a company to move forward faster by avoiding spending months, if not years, tweaking its product before the public ever even sees it. There’s no need to build out all potential features of the product that could serve some users in some situation, when a bare-bones version enables learning and may even be adequate for early adopters. An extreme MVP, in which much of the processing is done by humans rather than being automated, is (for obvious reasons) known as a Wizard of Oz MVP.


User Interface Design

User interface (UI) design or user interface engineering is the design of user interfaces for machines and software, such as computers, home appliances, mobile devices, and other electronic devices, with the focus on maximizing usability and the user experience. The goal of user interface design is to make the user's interaction as simple and efficient as possible, in terms of accomplishing user goals (user-centered design).

Good user interface design facilitates finishing the task at hand without drawing unnecessary attention to itself. Graphic design and typography are utilized to support its usability, influencing how the user performs certain interactions and improving the aesthetic appeal of the design; design aesthetics may enhance or detract from the ability of users to use the functions of the interface.[1] The design process must balance technical functionality and visual elements (e.g., mental model) to create a system that is not only operational but also usable and adaptable to changing user needs.

Interface design is involved in a wide range of projects from computer systems, to cars, to commercial planes; all of these projects involve much of the same basic human interactions yet also require some unique skills and knowledge. As a result, designers tend to specialize in certain types of projects and have skills centered on their expertise, whether it is software design, user research, web design, or industrial design.


User Experience (UX)

In commerce, user experience (UX) is a person's emotions and attitudes about using a particular product, system, or service. It includes the practical, experiential, affective, meaningful, and valuable aspects of human–computer interaction and product ownership. Additionally, it includes a person's perceptions of system aspects such as utility, ease of use, and efficiency. User experience may be subjective in nature to the degree that it is about individual perception and thought with respect to a system. User experience varies dynamically, constantly modifying over time due to changing usage circumstances and to changes to individual systems as well as to the wider usage context in which they operate. In the end, user experience is about how a user interacts with, and experiences, a product.

User experience (UX) design is the process design teams use to create products that provide meaningful and relevant experiences to users. This involves the design of the entire process of acquiring and integrating the product, including aspects of branding, design, usability and function.


API (Application Programming Interface)

In computer programming, an application programming interface (API) is a set of subroutine definitions, communication protocols, and tools for building software. In general terms, it is a set of clearly defined methods of communication among various components. A good API makes it easier to develop a computer program by providing all the building blocks, which are then put together by the programmer.

An API may be for a web-based system, operating system, database system, computer hardware, or software library.

An API specification can take many forms, but often includes specifications for routines, data structures, object classes, variables, or remote calls. POSIX, Windows API and ASPI are examples of different forms of APIs. Documentation for the API usually is provided to facilitate usage and implementation.

API, short for application programming interface, refers to a set of protocols through which programmers can access external services (from other programs), which may query proprietary data, access sensors or actuators, or perform computations. In essence, an API acts as a library of pre-established communication links that allows a developer to build a computer program that “talks” with a different computer program in order to make use of a desired service. Without good APIs, Google Maps, Yelp reviews, and YouTube videos would not be so ubiquitous, it would be impossible to copy text from Microsoft Word and paste it to Facebook, and movie tickets could not be purchased via mobile app.
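
A minimal sketch of the contract side of an API: the calling program only needs to know the agreed-upon request parameters and response fields, not how the service works internally. The weather service, its endpoint, and its field names here are all invented for illustration.

```python
import json

# A hypothetical weather API documents that responses are JSON objects
# with "city" and "temperature_c" fields; that documented contract is
# all a client needs to parse the answer.
def parse_weather_response(raw_json):
    data = json.loads(raw_json)
    return data["city"], data["temperature_c"]

# A canned response of the kind such a service might return:
sample = '{"city": "Palo Alto", "temperature_c": 21.5}'
print(parse_weather_response(sample))  # ('Palo Alto', 21.5)
```

In a real client, the raw JSON would come back over HTTP from the service's published endpoint; the parsing contract stays the same.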


SDK

SDK stands for “Software Development Kit”, which is a great way to think about it — a kit. Think about putting together a model car or plane. When constructing this model, a whole kit of items is needed, including the kit pieces themselves, the tools needed to put them together, assembly instructions, and so forth.

An SDK or devkit functions in much the same way, providing a set of tools, libraries, relevant documentation, code samples, processes, and/or guides that allow developers to create software applications on a specific platform. If an API is a set of building blocks that allow for the creation of something, an SDK is a full-fledged workshop, facilitating creation far outside the scope of what an API would allow.

SDKs are the origin of almost every program a modern user would interact with. From the web browser you work in all the way to the video games you play at the end of the day, many programs were first built with an SDK, even before an API was used to communicate with other applications.

Open Source

Open source is a development model that actively encourages universal access to technology. By making software open source, the copyright holder grants the public the right to study, alter, and redistribute it. Open-source projects select from various standard licenses that limit, in different ways, what people may do with the code after downloading it. For instance, licensers might not allow closed-source modifications to the code. Contributions to open-source projects are often loosely coordinated, a phenomenon supported by decentralized version control: tools and procedures that help people work independently on different aspects of a project without unnecessarily interfering with one another, and that also enable existing projects to spawn new ones, so that they can evolve in diverse directions.


Venture-Backed

A startup that has received at least one investment round that included a venture capital fund is said to be venture-backed. Businesses financed by venture capital tend to be high-risk and high-potential. In exchange for assuming this substantial risk to their investment, the venture capitalist will take an ownership (and control) position in the company.

Bootstrapping

“Pulling yourself up by your bootstraps” is an old saying that refers to the laces (or straps) on one’s boots, and the use of these to move the entire boot forward (perhaps literally out of the mud). It indicates making do with what you already have in order to have more (as in self-reliance) and is also the basis of the term “boot” as applied to starting a computer (using primitive systems to initialize more complex ones).

In Silicon Valley, bootstrapping has come to denote a founding team self-funding their venture. The startup may forgo angel and venture backing entirely, or, as in the case of eBay, bootstrap for several years while developing market traction.


Crowdfunding

Crowdfunding is the process of funding a new project or business by raising money from a multitude of people, often unknown to the entrepreneur, who make large or small investments in the new venture.

The crowdfunding model has three components: the initiator, who proposes the idea to be funded; the people who provide funds to support the idea, known as backers; and a “matchmaker” organization that brings the two together. Typically, the matchmaker is a web site dedicated to that purpose.

Two of the most prominent crowdfunding firms are Indiegogo and Kickstarter.


Indiegogo is an American crowdfunding website founded in 2008 by Danae Ringelmann, Slava Rubin, and Eric Schell. Its headquarters are in San Francisco, California. The site was one of the first to offer crowdfunding. Indiegogo allows people to solicit funds for an idea, charity, or start-up business.


Kickstarter is an American public-benefit corporation based in Brooklyn, New York, that maintains a global crowdfunding platform focused on creativity and merchandising. The company's stated mission is to "help bring creative projects to life".


Incubator

Business incubation programs are often sponsored by private companies or municipal entities and public institutions, such as colleges and universities. Their goal is to help create and grow young businesses by providing them with necessary support and financial and technical services. There are approximately 900 business incubators nationwide, according to the National Business Incubation Association.

Incubators provide numerous benefits to owners of startup businesses. Their office and manufacturing space is offered at below-market rates, and their staff supplies advice and much-needed expertise in developing business and marketing plans as well as helping to fund fledgling businesses. Companies typically spend an average of two years in a business incubator, during which time they often share telephone, secretarial office, and production equipment expenses with other startup companies, in an effort to reduce everyone's overhead and operational costs.

Not all business incubators are alike, however, so if you have a specialized idea for a business, try to find the incubator that best suits your requirements. If you're interested in finding an incubator in your state, visit the National Business Incubation Association's website. Or get in touch with your local economic development agency, located in the phone book under the listing for your state government. You can also call the information offices of your local colleges and universities to see whether they have any business incubation programs.


Accelerator

The purpose of an accelerator is, as the name suggests, to help fledgling startups accelerate their growth. Before accelerators there were incubators; the first modern incubator to take equity in startups was IdeaLab, founded in 1996 by Bill Gross. Accelerators and incubators both provide varying levels of growth support to a startup, the primary difference between the two being that accelerators tend to have a set program duration. The route into an accelerator typically starts with a competitive application process. Some popular accelerators have acceptance rates as low as 1%. The accelerator usually makes an investment (typically between $20K and $50K) in each of the selected startups (known collectively as a cohort). During the program term, the startups participate in educational programs and receive advice from mentors. Each member startup is expected to refine and validate its business concept and make rapid progress in developing a company, culminating in a demo day where presentations are made to investors. The combination of learning, connections, and the recognition for participating in an accelerator can help propel a firm forward. And the programs themselves can earn a lot of money if their graduates do well.

Y Combinator

The first accelerator was Y Combinator, started in 2005. Since then, accelerators have multiplied, and lately it seems that everyone and their brother has one. There are various specialized accelerators as well; some are restricted to female entrepreneurs, particular technologies, or social entrepreneurs. Some accelerators are associated with collaborative workspaces, and provide space as a benefit to the ventures. Others are associated with educational institutions, and provide access to and lectures by faculty.

From their perspective, VCs are being asked to pay a premium (in terms of higher valuation) for startups that come out of a big-name accelerator. Are graduating startups really better, or just more polished? To the extent that “polish” is a key success factor, it may not be a meaningful distinction.

Term Sheet

The term sheet states all the essential terms and conditions of an investment or acquisition. It is not, however, the definitive legally binding document that an entrepreneur might ultimately sign, but predominantly a statement of intent. In particular, after the term sheet is signed, the deal is still subject to a due diligence review by the investor.

The initial term sheet for an investment will normally be proposed by the lead investor and negotiated with the entrepreneur. This complex document may include (among other terms):

• Investment: how much, in how many tranches, at what milestones

• Valuation (pre-money): how much the company is assumed to be worth under the deal

• Vesting: a schedule to be imposed on the founding team, even if they had been fully vested

• Liquidation preference: a multiple of the amount invested that investors insist on receiving upon liquidation or sale of the company, prior to payments to other shareholders

• Participation: for how long, upon liquidation or sale and after the preference is paid, the new shares should continue to be paid alongside those of other investors: either indefinitely (fully participating), until some cap after which other investors are given a chance to "catch up", or never (nonparticipating—don't fret for the investors in this case as they still have an opportunity to convert to common shares)

• Anti-dilution: protects the investor from new shares being issued at a lower price (a down round) by allowing, in that circumstance, their shares to convert to more shares of common; its most prevalent forms use weighted averages; full-ratchet protection, its most extreme, gives the investor the full benefit of the new price, preserving his or her ownership percentage

• Redemption right: allows the investor to reclaim their funds by selling their shares back to the company (considered dangerous for founders)

• Protective provisions: specify actions requiring approval of preferred shares or of the Board

• No shop clause: prohibits the founder from entering into (but not necessarily continuing) negotiations with other investors for a fixed time period during due diligence; the investor will be under no such constraint

• Voting rights: at what multiple, if any, of an as-converted basis preferred stock may vote

• Drag-along provisions: force acquiescence of common shares to certain actions (such as a sale of the company) approved by preferred shares

• Pay-to-play provisions: punish preferred investors in any of a number of ways for not participating in future investment rounds
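
To make the liquidation preference and participation terms concrete, here is a toy calculation for a 1x non-participating preference, under which the investor takes the greater of the preference or their as-converted common share of the proceeds. All dollar amounts and percentages are invented for illustration.

```python
# Sketch of a 1x non-participating liquidation preference.
def investor_payout(sale_price, invested, preference_multiple, ownership_fraction):
    preference = invested * preference_multiple       # guaranteed floor
    as_converted = sale_price * ownership_fraction    # value if converted to common
    return max(preference, as_converted)

# $5M invested at a 1x preference for 25% of the company:
print(investor_payout(20_000_000, 5_000_000, 1.0, 0.25))   # prints 5000000.0
print(investor_payout(100_000_000, 5_000_000, 1.0, 0.25))  # prints 25000000.0
```

In the low-priced sale the preference protects the investor's principal; in the high-priced sale converting to common pays more, which is why non-participating investors are happy to convert.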


Vesting

Vesting is the practice of awarding equity (stock or stock options) in a startup to an employee or other contributor over a period of time rather than all at once. It serves to keep early employees (or co-founders) of a company from unfairly leaving with their equity before their proportional share of the work required to build the company has been completed. Vesting is a process through which one becomes invested in the firm.

Equity normally vests gradually over a period of time. Stock or options that have not vested are considered restricted. As equity vests, the employee receives control of more, and eventually all, of his or her shares, after which time the equity is said to be fully vested. Typically, this is formulated as reverse vesting—a vanishing repurchase right held by the company. While any at-will employee is subject to termination without cause, threats of firing someone unless their unvested stock is released are frowned upon. Some companies have provisions in their option plans for clawing back vested stock if an employee is terminated for cause.

Vesting cliffs enable a startup to offer equity on a trial basis. If the relationship ends by action of either party during the trial period, i.e., before reaching the cliff, then the departing party receives no equity. A typical vesting schedule is four years monthly with a one-year cliff, meaning that a quarter of the equity is vested after the first year and one forty-eighth of the equity is vested each month thereafter.

Vesting can be accelerated by certain events. Single-trigger acceleration requires only a single event, generally transfer of control of the company. For founders, double-trigger acceleration is preferred by VCs as it requires both transfer of control and termination without cause, making the company more attractive to potential acquirers who can apply funds to the purchase price that would otherwise have to be spent incentivizing fully vested individuals to stay. Regardless of the number of triggers, acceleration may be full or partial, depending on how much equity vests upon the event. "Full acceleration on exit" means that if the company is sold before the vesting period has ended, the remainder of the promised equity vests immediately. Advisors to the firm, having discharged their duty, will likely receive at least partial acceleration, while the founding team, being of more use to an acquirer, will likely receive at most that under a single trigger. Severance clauses may separately allow for partial or full acceleration upon termination without cause.


Equity

Equity is an ownership interest in a company, represented by stock or another security. As such, it serves as a claim to the net assets of a firm and any future return the firm may generate. Equity in a startup can be issued directly or in the form of stock options. It can be sold in exchange for invested assets or granted as compensation for contributed labor (so-called sweat equity).

Stock is held in units called shares, but in startups people often think in percentages. A corporation may issue up to its number of authorized shares of stock to shareholders. Startups usually consider all issued shares to be outstanding and any recovered shares to be unissued, rather than treasury shares held by the corporation itself without exercisable rights. Startups often issue stock in only two classes: common and preferred.

Echoing a past boom, equity is the sand through which people at every echelon in Silicon Valley now sift, seeking their share of gold.

Common Stock

Common stock has a high risk profile but always entitles holders (ultimately, the general public) to participate in the company's appreciation over the long term. It is typically characterized by one vote per share, proportional economic interest in the net assets of the company, and lack of additional or special rights. When offered to employees, it is often restricted, severely so in the case of stock that has not yet vested. This stock is non-transferable, has no privileges, and is subject to effective reclamation by the company in the event that it fails to vest, but is considered to have been issued to shareholders. Employee stock options may be similarly restricted.

Preferred Stock

Preferred stock is most often held by VCs and, occasionally, angel investors. It differs from common stock most fundamentally in that its holders receive distributions before holders of common stock receive similar distributions, and sometimes are entitled to additional distributions beyond those due to holders of common stock. Traditionally, preferred stock does not give holders the right to vote on company issues, but in Silicon Valley it can often be voted as if it had been converted to common, i.e., on an as-converted basis. The conversion is initially one-to-one, but may be more generous if preferred shareholders have increased their holdings relative to common due to anti-dilution provisions triggered by a down round.

Round of Funding

A round of funding, or investment round, refers to a grouping of investments in a startup over a limited time frame and under similar terms, each potentially including multiple investors.

Early in its search for external investment, the startup will likely approach friends and family or angel investors. Each such separate investment is considered seed funding, but the term seed round is often reserved for a syndicate investment involving angels and possibly early-stage VC firms, which may not intend to continue participating in all future rounds. The goal is to provide enough capital with which to assemble a management team, build a prototype, test proof-of-concept, or do whatever is most needed to propel the company toward milestones for attracting further investment. The seed round is usually kept small to avoid unnecessary dilution of the founding team, and because valuation at this point is still very subjective. A priced investment round is one for which a valuation is set and used to sell equity.

Subsequent rounds of funding are generally received from traditional VC firms, with the intention, and sufficient resources, to continue participating (if the startup continues growing). The first of these is called the Series A round (or simply the A round or A-round), referring to the Series A preferred stock issued to the investors. At this critical stage, the startup will usually raise $2-10M for a 20-50% stake. This A-round is expected to last one to two years.

Successive rounds of funding occur based on evidence of continued development, viability, and validation via increasingly quantitative metrics of the startup's progress. In each such round, shares of a new series of preferred stock are authorized and issued, the capitalization table extended, and the letter incremented. Hence we arrive at a Series B round potentially followed by Series C, and so on.

None of this is intended as a precise formula. For example, either crowdfunding or strategic investment by established industry partners is always possible. A corporate VC (the investment arm of a corporation, a VC in that it invests the funds of its shareholders, sometimes along with funds from partner firms) may make only strategic or more general investments.


A valuation is an appraisal of how much a company's equity (assets less liabilities) is worth (as opposed to what may be put forth on a balance sheet). No reliable methodology exists for determining the valuation of a startup, which by nature has limited history and an expectation of rapid growth. Estimating and discounting future cash flow (using notions of market size and exit value) can be useful, but these numbers can only be rough estimates based upon limited data and subject to market changes, including actions of new or existing competitors (as constrained by barriers to entry, which will enhance valuations). Basing the value on cumulative cash expended (on the theory that the company is worth what it would cost to recreate) rewards profligacy and ignores future potential. The conventional wisdom suggests using more objective (later) valuations of comparable companies in similar verticals and market position at similar stages of maturation, but not every startup can expect to walk in Facebook's footsteps. One can compare against similar companies at a similar stage, but that gets rather circular. All of the above and numerous heuristics may be used by startups and investors in deciding how to price the company's equity. Valuations of the company for stock option grants are generally more conservative than those for investment rounds.

What is clear is that the company will have considerably more assets after a major funding event than it had before. In specifying valuations around any funding event (round of funding), one must therefore distinguish the pre-money valuation (excluding the current investor's funds) from the post-money valuation (including those funds). Pre-money and post-money are common short forms, and these are sometimes abbreviated further as simply pre or post.

The investment amount and either form of valuation are sufficient to determine the investor's ownership percentage. By injecting $2M into a company valued at $3M pre-money, for example, an investor purchases a 40% stake in an entity valued, post-money, at $5M. Since the number of shares outstanding is arbitrarily determined, the pre-money valuation stands in for the share price, with the percentage ownership sold stated in terms of the post-money valuation.
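The arithmetic can be sketched in a few lines of Python, using the hypothetical figures from the example above:

```python
def round_math(pre_money, investment):
    """Post-money valuation and the investor's resulting ownership stake."""
    post_money = pre_money + investment
    stake = investment / post_money      # percentage sold, stated post-money
    return post_money, stake

post, stake = round_math(pre_money=3_000_000, investment=2_000_000)
print(f"post-money: ${post:,}  investor stake: {stake:.0%}")
# post-money: $5,000,000  investor stake: 40%
```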

While a valuation is implied by a funding event, when equity is issued for other purposes (including grants of stock options) the taxman will want the valuation to be backed up by a more formal appraisal process.

Capitalization Table

• A capitalization table is a table showing the equity ownership capitalization for a company.

• The capitalization table is essential for financial decisions involving equity ownership, market capitalization, and market value.

• Capitalization tables help private companies track their market value. In the private market, they are also important for shareholder reporting and marketing new capital issuances.

The capitalization table (also called the cap table or cap sheet) represents the details of the startup's ownership and financings. Generally written in a spreadsheet or table format, the cap sheet provides a bird's-eye view of the relative standing of its investors with respect to each other and often also a significant historical record for the startup of how this situation evolved over time through the investments and valuations it has received.

The detailed capitalization table includes a row for each investor (including the stock option pool and holders of convertible notes) and columns for each class of stock and each series of each class. For each investor and series, it typically presents at least the number of shares owned, the investment amount, and the resulting ownership percentage represented by the holding relative to that series and class of stock, as well as relative to common stock on an as-converted basis. It thus serves to relate the collective impact of various investment rounds on the individual shareholders involved. In summary form, the cap table may omit numbers of shares and aggregate the holdings of each series of each class of stock. A pro forma capitalization table is a possible future cap table derived via calculations from the present one based on the terms of some possible agreement or other scenario, likely involving issuing new shares or repurchasing outstanding ones.
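A minimal cap-table sketch in Python, with made-up holders and share counts, shows how the ownership percentages fall out of the share counts:

```python
# Hypothetical summary cap table: holder -> (series, shares issued).
holdings = {
    "Founders":    ("Common",      6_000_000),
    "Option pool": ("Common",      1_000_000),
    "Seed angels": ("Series Seed", 1_000_000),
    "VC fund":     ("Series A",    2_000_000),
}

total = sum(shares for _, shares in holdings.values())
for holder, (series, shares) in holdings.items():
    # Ownership percentage relative to all shares, on an as-converted basis.
    print(f"{holder:<12} {series:<12} {shares:>9,} {shares / total:6.1%}")
```

A pro forma version would simply recompute the percentages after adding the new shares a proposed round would issue.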

Liquidation preferences and other terms may separate the ownership percentages on the cap table from the actual payouts upon different liquidity events. A waterfall analysis, including liquidation preference charts (also known as liquidation curves), helps express those potential outcomes in terms of the effect of different exit values on each shareholder. The name refers to how the remaining value cascades through the various classes of stock as preferences are taken. The cap table is essential to understanding the dilution implications of decisions by startups, bankers, venture capitalists, and other investors regarding further financing or equity distribution to employees and their respective terms.
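The waterfall idea can be sketched for the simplest common case, a 1x non-participating preference; the dollar figures below are hypothetical:

```python
def waterfall(exit_value, invested, pref_pct, multiple=1.0):
    """Simplified waterfall for non-participating preferred: holders take
    the greater of their liquidation preference or their as-converted
    share of the proceeds; common stockholders receive the remainder."""
    preference = invested * multiple
    as_converted = exit_value * pref_pct
    preferred_payout = min(exit_value, max(preference, as_converted))
    return preferred_payout, exit_value - preferred_payout

# $5M invested for 25%: at a $10M exit the $5M preference beats the
# $2.5M as-converted share; at a $40M exit conversion wins instead.
print(waterfall(10_000_000, 5_000_000, 0.25))   # (5000000.0, 5000000.0)
print(waterfall(40_000_000, 5_000_000, 0.25))   # (10000000.0, 30000000.0)
```

Plotting `preferred_payout` against `exit_value` gives the kinked liquidation curve described above.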

Although the reduction in the founding team's ownership percentage as the cap table grows may seem depressing, the cap table should also reveal the increasing worth of those shares.

Stock Options

Stock options have become an integral component of employee compensation in startups. Common wisdom dictates that employees who have chosen to work in a riskier situation than a more established company should participate in the firm's financial success. Moreover, this incentive can compensate for a startup's difficulty in paying salaries and benefits at market rates.

An employee stock option represents a right to buy the underlying stock at a predetermined acquisition price (the strike price, or exercise price) for some time into the future (the duration), often up to 10 years, though typically exercisable for only three months after the termination of the holder's service to the company. The objective is for the stock to increase in value such that the strike price will represent a substantial discount on the market price per share at the time of purchase, with the holder benefitting in proportion to that discount. When the strike price exceeds the current market price, the option is said to be underwater; such options are rarely exercised, for obvious reasons. As with a direct grant of equity, stock options have significance based on the percentage of ownership potentially transferred. Employee stock options are subject to vesting and typically cannot be exercised until they vest.
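The payoff logic is simple arithmetic; here is a sketch with hypothetical share counts and prices:

```python
def option_gain(shares, strike, market_price):
    """Pre-tax gain from exercising vested options and selling at market.
    Underwater options (strike above market) are worth nothing exercised."""
    return max(0.0, market_price - strike) * shares

print(option_gain(10_000, strike=0.50, market_price=12.00))  # 115000.0
print(option_gain(10_000, strike=0.50, market_price=0.25))   # 0.0 (underwater)
```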

Of the two types of stock options that a startup might grant, incentive stock options (ISOs) are the more avidly coveted. For ISOs there is normally no ordinary taxable income upon either grant or exercise—only capital gains upon the subsequent sale of the shares purchased through the exercise of the option. Non-qualified stock options (NSOs) are subject to ordinary income tax on the difference between the exercise price and the value of the stock when the option is exercised. ISOs are highly restricted and for employees only, while NSOs may be granted to employees, consultants, and non-employee directors.

Upon a financing round, investors may request that shares be set aside in a stock option pool for future hires. In what is known as an option pool shuffle, investors may request that the value of these shares be included in the pre-money valuation, so that they are granted at the founders' expense.
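A simplified model of the effect, with made-up figures: if a pool sized as a percentage of the post-money is carved out of the stated pre-money, the founders are effectively paid on a lower pre-money valuation:

```python
def effective_pre_money(stated_pre, pool_pct, post_money):
    """Option pool shuffle (simplified): a pool carved out of the stated
    pre-money valuation lowers the effective pre-money that the existing
    shareholders are actually being paid on."""
    return stated_pre - pool_pct * post_money

# $8M stated pre, $2M invested ($10M post), with a 15% post-money pool
# taken from the pre-money side:
print(effective_pre_money(8_000_000, 0.15, post_money=10_000_000))  # 6500000.0
```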


The portfolio refers to the combination (or pool) of companies in which the VC firm has invested its limited partners' money and those companies are referred to as portfolio companies, regardless of which fund the investments are part of. A VC's holding period for the companies in its portfolio will generally depend on the lifetime of the fund.

VCs may provide advice or support (wanted or not) to their portfolio companies, beyond their financial investment. Some of this may be provided by an entrepreneur-in-residence (EIR). Such individuals are held "in the stables" of a VC firm or other prospective investor and are expected to be open to taking a formal executive role within one of those portfolio companies or to accept funding for their own next company as appropriate.

Most VC firms will specialize in a particular stage of growth. Many specialize in a particular geographic area. Some VCs may specialize in a particular industry, while others diversify across industries. In either case, VCs will seek an appropriate balance in new investments, hitting their area of interest but avoiding excessive overlap with companies already in their portfolio.

VC firms take seriously the importance of maintaining deal flow, i.e., a steady stream of appropriate investments. While this is not a challenge for the most selective, top-tier VCs, for others it may be difficult. Often, VCs find new investments through contacts in their portfolio companies.

Investors may tend to focus their limited attention on the one or two startups in their portfolio that appear most likely to become a unicorn. This is understandable, given that a big winner is crucial to produce the outsized returns needed to offset the losses wreaked by failed companies. Such returns may be reaped at an exit event, such as an IPO or an acquisition. A risky game indeed.


Financings come together for venture backing in numerous ways. Sometimes only one VC backs the deal, but if two or more entities invest in it, they form a syndicate. A single VC (the lead investor) may form a syndicate with angel investors, other VCs, or even seed-stage funds, and they jointly provide the capital. Usually all investors get the same terms, but occasionally sweeter terms are bestowed on one or another via a side letter.

Especially at the seed round when the startup's risk of failure is greatest, syndicates can serve as a way of reducing that risk through diversification while giving the startup access to a greater amount of funding than any one investor alone is willing to provide. In subsequent rounds, issues of diversification and pooling of funds may still be at play. Furthermore, an investor from a prior round may wish to continue to participate and, indeed, it can look very bad for the startup if no previous-round investor chooses to carry on (a prospect known as signaling risk). Once a startup has achieved serious momentum, however, any number of investors may want a piece of the action.

There are also drawbacks to syndication for a startup. One partner backing out during negotiations would send a negative signal that could influence others. The smaller percentage ownership granted to each of multiple VCs could have adverse consequences such as crowding out new investors if things go very well or making it more likely existing ones will lose interest otherwise. Then there's the unpleasant prospect of in-fighting among syndicated investors.

AngelList has introduced a notion of syndicates among individual investors, where a lead investor entices backers to place funds under his or her management. This blurs the lines between angel investors, venture capitalists, and limited partners (although unlike limited partners, backers generally must approve each transaction and can decline to continue participating in new deals at any time).


AngelList is a U.S. website for startups, angel investors, and job-seekers looking to work at startups. Created in 2010, the platform has a mission to democratize the investment process and to help startups with their challenges in fundraising and talent. It started as an online introduction board for tech startups that needed seed funding. Since 2015, the site has allowed startups to raise money from angel investors free of charge.

Burn Rate

• The burn rate is the pace at which a new company is running through its startup capital ahead of it generating any positive cash flow.

• The burn rate is typically calculated in terms of the amount of cash the company is spending per month.

• Gross burn is the total operating costs a company racks up each month, while net burn is the total amount of money it loses monthly.

The burn rate of a startup refers to its negative cash flow: the amount by which its cash expenditures exceed the cash it takes in. This burning of cash typically goes along with starting a company and getting an idea off the ground. Complementary to the rate at which a startup burns cash is its runway, the amount of cash remaining. As with a departing airplane, a startup must avoid exhausting its runway before lift-off. The burn rate is usually calculated as a monthly figure and provides an important indicator to management of how much time they have before they must either raise or earn more money, or close their doors. A startup that runs short on cash may also be able to arrange a bridge note (a short-term form of convertible note) from its VC, but it's never good to be desperate.
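Runway is simply cash on hand divided by net burn; a sketch with hypothetical figures:

```python
def runway_months(cash, monthly_revenue, monthly_costs):
    """Months of runway remaining at the current net burn rate."""
    net_burn = monthly_costs - monthly_revenue   # cash lost per month
    if net_burn <= 0:
        return float("inf")                      # cash-flow positive: no burn
    return cash / net_burn

# $1.2M in the bank, $50K coming in and $250K going out each month:
print(runway_months(1_200_000, 50_000, 250_000))  # 6.0 (months)
```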

In a project management context, burn rate refers to net cash expenditures on a project relative to its budget. Rent, salaries, consultants, professional services such as attorneys and accountants, and conference attendance all contribute to the burn rate. The ultimate driver of costs is usually headcount.

Startups need to scale, and scaling requires spending. But as Fred Brooks made clear long ago in The Mythical Man-Month, you can't always solve a problem faster by throwing more people (thus money) at it. Every startup must decide up to what point it is getting its money's worth in a holistic way that recognizes both the bottlenecks strewn throughout the organization and the need for sustainability with respect to expected cash inflows. Bessemer Venture Partners' Byron Deeter breaks these down as: 1) Keeping the company structurally sound for the long term; and 2) Maintaining favorable unit economics.

Burning cash makes a startup feel like a much bigger business but, ironically, big businesses often wish they could feel like a startup. Resisting temptation and keeping the burn rate low, perhaps even lower than investors expect, takes discipline but can provide insurance for tough times and a cushion in timing the next funding round. On the other hand, any potential competitive advantage should be leveraged, and cash can provide such an advantage.


• In startup financing, a round's funds may be released in tranches: portions disbursed only as agreed milestones are reached.

• Milestones may take the form of sales, user counts, entrance into new markets, or release of product features.

• Tranches limit investors' downside but risk locking the startup into plans that cease to be advantageous.

When a startup raises a round of funding, it often collects the funds not all at once but rather on a time schedule where additional funds are released only if certain milestones are reached. Those milestones may take the form of sales, user counts, entrance into new markets, or release of product features. French for "slices" or "portions," tranches are defined as the individual payments of invested funds.

Founders of a startup are likely to prefer investment rounds without tranches, for the same reasons they might prefer long rounds (i.e., rounds expected to be widely spaced) to short ones: these let them concentrate on their work without distraction, enable them to commit to investments that will grow the firm, and create a unity of interests between founders and investors. Investors, however, may see it differently, desiring a way to limit their downside if the expectations upon which the investment is made are not realized, while locking in a right to double down if they are. Because startup founders are optimists, long rounds have the corresponding disadvantage vis-à-vis short ones of selling more of the company at a lower price; if that lower price is to be locked in anyway, a round without tranches is clearly more desirable for the founders than one with them, which would give up the aforementioned benefits.

Startups may therefore be left with a trade-off between raising money through more investment rounds or through fewer rounds with more tranches in each. Because terms are renegotiated between investment rounds but set in advance for tranches, founders may consider that since they will be cut off upon missing milestones, they might as well reap the benefit of better terms upon achieving them.

Perhaps tranches could be set with the price increasing along with the expected value of the firm. Regardless, tranched investments carry many disadvantages, mostly related to milestone selection, which may be arbitrary or ambiguous and risks locking the startup into plans that cease to be advantageous (the opposite of an Agile approach).

Down Round

• A down round refers to a private company offering additional shares for sale at a lower price than shares sold for in the previous financing round.

• Company valuation is subject to variables (failure to meet benchmarks, emergence of competition, venture capital funding) causing it to be lower than it was in the past.

• A down round can lead to lower ownership percentages, loss of market confidence, and damaged company morale.

A down round is a round of financing in which investors purchase stock from the company at a valuation lower than that of the previous round.

Down rounds result in dilution of ownership for existing shareholders, who retain a reduced percentage of equity. Dilution occurs any time a new shareholder purchases shares or one current shareholder purchases more shares than others, but in those common situations the current shareholders retain a smaller piece of a larger pie. In a down round, though, the pie is being recognized as smaller than the current shareholders had thought.

Since previous investors generally have anti-dilution clauses, which attempt to restore their allocation percentage after a new round that dilutes in this way, the founding team's portion of equity takes the re-balancing hit. As a result, the founding team may find their interest in the company greatly devalued, or even potentially worthless.
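One widely used (though not universal) formulation is broad-based weighted-average anti-dilution; the share counts and prices below are hypothetical:

```python
def weighted_average_price(old_price, shares_before, money_raised, new_shares):
    """Broad-based weighted-average anti-dilution (a common formulation):
    new conversion price = old_price * (A + B) / (A + C), where
      A = fully diluted shares before the down round,
      B = shares the new money would have bought at the old price,
      C = shares actually issued in the down round."""
    would_have_bought = money_raised / old_price              # B
    return old_price * (shares_before + would_have_bought) / \
           (shares_before + new_shares)

# Preferred originally bought at $2.00; a down round sells 2M new shares
# for $2M ($1.00 each) against 10M shares outstanding:
new_price = weighted_average_price(2.00, shares_before=10_000_000,
                                   money_raised=2_000_000,
                                   new_shares=2_000_000)
print(f"${new_price:.2f}")  # $1.83
```

Lowering the conversion price means each preferred share converts into more common shares, restoring part of the earlier investors' percentage at the founders' expense. A harsher "full ratchet" clause would instead drop the conversion price all the way to the new round's price.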

While no company wants to raise a down round, it can happen when valuations of the company at previous investment rounds seem too high at the time of a subsequent round. New investors aren't willing to come in at the previous valuation and insist upon a lower price. Examples of why this may occur include the initial valuation having been too high, the startup failing to meet expectations in the expected time frame between rounds, the startup having too high a burn rate for the amount of traction it is gaining so that it runs short of cash and is unable to hold out for a higher valuation, or a red-hot economy turning bad, causing overall re-evaluation of valuations across the wider startup ecosystem.

Although they create complications for both past and future investors, and are extremely demoralizing, down rounds can bring in the cash needed to give a company time to turn itself around.

Exit Strategy

• An exit strategy, broadly, is a conscious plan to dispose of an investment in a business venture or financial asset.

• Business exit strategies include IPOs, acquisitions, or buy-outs but may also include strategic default or bankruptcy to exit a failing company.

• Trading exit strategies focus on stop-loss efforts to prevent downside losses and take-profit orders to cash out of winning trades.

Intrinsically embedded in the concept of a venture-backed entity is the end goal of cashing out of the business in what is known as an exit. The startup may have as its goal either an IPO or an acquisition (buyout) of the firm; the latter form of transaction is serviced by professionals in a sector known as mergers and acquisitions (M&A). The exit strategy refers to the startup's plan for how it wishes to exit. An unfortunate form of exit is the liquidation, i.e., dissolution, of the company, in which its assets are sold with the proceeds going to repay its creditors and any residual returned to its investors. Liquidation of a startup can be triggered by the VC's need to return funds to its limited partners.

The exit event has historically been the only way for co-founders or VCs to cash out on their investment, as they hold shares that are not publicly traded and have therefore been difficult to sell. Other liquidity events are, however, possible. One party may arrange to cash out, in whole or in part, as new money is invested, or startup shares may be sold on newly arisen and controversial secondary markets. It is possible, but unusual, for the founding team and other investors to allow a VC to exit via a leveraged buyout (LBO), in which the firm becomes collateral for a loan used to compensate the exiting party. The root "liquid" here generally refers to the assets invested in the company becoming available to flow into other ventures, but one should be careful as there may be some ambiguity in who or what is becoming liquid.

The exit value is the monetary value that the company has, or expects to have, at the time of sale or IPO. It will vary depending on external economic conditions, which might favor IPOs or M&A. IPOs typically yield the highest return on investment, but many companies are happy to be acquired for a generally lower exit value, as the odds of a startup reaching an IPO are slim.

While founders may be unsure how they would exit, it is important to consider because differences with investors on this point can become the basis for conflict down the road.


In an initial public offering (IPO), a company lists its shares on a stock exchange for the first time, providing the public the opportunity to invest in the company's equity and growth in the form of publicly traded stock. From this point forward, institutional and individual investors will generally not interact directly with the company to invest capital but will instead buy and sell shares of the company via the stock market. Upon an IPO, preferred shares will generally convert to common stock and there will no longer be any qualitative difference between holdings of stock of different classes. The firm may subsequently issue additional preferred stock.

IPOs are often discounted so that the share price will shoot up on the first trading day (the first-day "pop"), making them a sought-after investment. IPOs, however, are not always sure-fire wins. In some situations, IPO stock has actually lost money from the opening trade price to the closing price of the first day.

Not that long ago in Silicon Valley, startups rushed to get their companies to an IPO. This was considered to be the best exit result and a validation of the company's success. More recently, the trend has been to push the IPO out, in some cases to avoid the costs, disclosures, and compliance responsibilities involved with public companies. Furthermore, the burgeoning trend of large companies purchasing promising startups as a source of innovation can provide an alternative exit strategy. Additionally, nontraditional investors like hedge funds and mutual funds have moved in to invest large amounts in pre-IPO startups or acquire them, easing the need to raise cash through an IPO. Finally, the still immature—but growing—secondary market for private shares allows employees of private companies a way to cash out, further limiting the motivation to go to IPO for liquidity. The resulting pressure on valuations, however, can limit future funding options for the startup.

The trend of forestalling IPOs may eventually recede, with some, such as author and investor Andy Kessler, warning that without the discipline of the public market, sky-high valuations can become detached from reality.

Hype Cycle

VCs and entrepreneurs are both, to some extent, herd creatures that chase after whatever is "hot" at the moment, with the hype and excitement building to a frenzy when a promising new technology has been discovered. The hoopla among the early adopters and trend-seekers then dies down as people realize that the new technology is not a panacea, and as their interest moves on to the next big thing. The market research and advisory firm Gartner codified the predictable yet intoxicating life cycle stages of adoption and diffusion that make up this roller coaster ride in a graphical representation known as the Hype Cycle. Plotting expectations over time, it describes the following five stages:

Technology Trigger. As news of a technology breakthrough spreads, significant publicity and interest are generated. Early adopters investigate and companies that offer the technology proliferate.

Peak of Inflated Expectations. The hype surrounding the concept quickly reaches a frenzied peak. While there are early success stories, there are also failures, especially as use expands beyond early adopters.

Trough of Disillusionment. As users are brought back to reality by the realization that their greatest hopes for the technology may not be met, sentiment takes a plunge and interest wanes. Negative press begins. There is a shakeout of providers, and those that survive need to make improvements in their products. Many technologies and their purveyors stall at this point, failing to cross the chasm—the term coined by Geoffrey Moore to describe expansion from early adopters to mainstream markets.

Slope of Enlightenment. A second-generation of products appears, and later a third generation. Methodologies and best practices develop. Adoption grows, and expectations slowly begin to rise again.

Plateau of Productivity. Expectations have leveled out, and high growth mainstream adoption begins.

The concept behind the Hype Cycle curve has crept into popular usage, with something that epitomizes an overexposed or clichéd trend referred to as "peak X."


Napoleon Bonaparte once said "A soldier will fight long and hard for a bit of colored ribbon." Today, companies are hoping that employees will work long and hard to get "badges" and other electronic equivalents of that bit of ribbon. The vehicle for doing this, called gamification, can be applied to affect the behavior not only of employees, but also of customers, students, and practically anybody else an organization comes in contact with.

It is said that motivation has four ingredients: having a clear goal or objective, knowing your progress toward that goal (your score), having control over the outcome, and getting rewarded for meeting your goal (preferably as quickly as possible). The effective combination of these elements is what makes video games so addictive, and indeed game designers carefully plan their games so that the balance and timing of challenge and reward draws in the player and holds their interest.

In gamification, game-like elements such as challenges, points, leaderboards, and levels are adapted for use by employees in the workplace, or by customers as part of their interaction with the company's offering. To apply gamification, a company will identify the behaviors they wish to reward (increased sales, healthy life habits, etc.), choose what rewards they can provide (at the level of resources, emotions, status, etc.), and design a system for granting those rewards. Rewards create a sense of competition, intended to enhance performance and engagement. Gamification has been used to promote employee wellness, customer loyalty, and goal achievement.

Gamification raises the question, however, of whether or not it is ethical to manipulate people, even to have them unwittingly act in their own interest.


Technology


Napoleon Bonaparte once said "A soldier will fight long and hard for a bit of colored ribbon." Today, companies are hoping that employees will work long and hard to get "badges" and other electronic equivalents of that bit of ribbon. The vehicle for doing this, called gamification, can be applied to affect the behavior not only of employees, but also of customers, students, and practically anybody else an organization comes in contact with.

It is said that motivation has four ingredients: having a clear goal or objective, knowing your progress toward that goal (your score), having control over the outcome, and getting rewarded for meeting your goal (preferably as quickly as possible). The effective combination of these elements is what makes video games so addictive, and indeed game designers carefully plan their games so that the balance and timing of challenge and reward draws in the player and holds their interest.

In gamification, game-like elements such as challenges, points, leaderboards, and levels are adapted for use by employees in the workplace, or by customers as part of their interaction with the company's offering. To apply gamification, a company will identify the behaviors they wish to reward (increased sales, healthy life habits, etc.), choose what rewards they can provide (at the level of resources, emotions, status, etc.), and design a system for granting those rewards. Rewards create a sense of competition, intended to enhance performance and engagement. Gamification has been used to promote employee wellness, customer loyalty, and goal achievement.
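The reward loop described above (identify behaviors, grant points, unlock badges at thresholds) can be expressed in a few lines. This is a minimal illustration; the behavior names, point values, and badge tiers are invented for the example:

```python
# Minimal sketch of a gamification reward loop: behaviors earn points,
# and crossing point thresholds grants badges. All names are illustrative.

POINTS = {"sale_closed": 50, "wellness_checkin": 10, "training_done": 25}
BADGES = [(100, "Bronze"), (250, "Silver"), (500, "Gold")]  # (threshold, badge)

class Player:
    def __init__(self, name):
        self.name = name
        self.score = 0
        self.badges = []

    def record(self, behavior):
        """Award points for a tracked behavior and grant any new badges."""
        self.score += POINTS.get(behavior, 0)
        for threshold, badge in BADGES:
            if self.score >= threshold and badge not in self.badges:
                self.badges.append(badge)

p = Player("alice")
for event in ["sale_closed", "sale_closed", "wellness_checkin"]:
    p.record(event)
print(p.score, p.badges)  # prints: 110 ['Bronze']
```

A real system would add the leaderboard (ranking players by score) and tune the balance and timing of rewards, which is where the game-design craft comes in.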

Gamification raises the question, however, of whether or not it is ethical to manipulate people, even to have them unwittingly act in their own interest.


MOOC

A massive open online course (MOOC) allows open access, via the Internet, to an unlimited number of participants around the world. Among the first successful MOOCs were Andrew Ng's class on machine learning and Sebastian Thrun and Peter Norvig's class on artificial intelligence in 2011. MOOCs became widespread by 2012 (which the New York Times called "The Year of the MOOC"), propelled by startups that Ng and Thrun were involved in founding.

MOOCs provide instruction and problem sets suitable for distance learning. Many also provide forums that support interaction between students, professors, and teaching assistants. MOOCs have the potential to engage students and customize education to their individual styles and needs in a way not possible in traditional lectures.

Despite the benefits and ease that MOOCs offer, they have many detractors. Some argue that website lectures cannot replace the unique experience of sitting in a classroom. The quality of MOOCs often varies dramatically. And some have noted the low completion rates of many MOOCs. Although many MOOCs are free, some charge for their services. Organizers have found that charging a fee increases student commitment—and thus completion rates. Free MOOCs may be financed by corporate sponsors.

Contrasting with MOOCs are SPOCs. A SPOC is a small private online course. It is often aimed at on-campus students. As with MOOCs, students using SPOCs work at their own pace and learn interactively through online lectures and labs.

Appropriate roles for traditional universities and technological solutions are being discussed and experimented with throughout Silicon Valley, but it seems uncontroversial that improved access to lifetime learning is essential.

Sharing Economy

- Definition

"A shared or sharing economy is an economic system in which assets or services are shared between peers or businesses for free or for a fee. The concept is to enhance the usability of assets, making their lifespan more worthwhile."

The sharing economy is an economic model defined as a peer-to-peer (P2P) based activity of acquiring, providing, or sharing access to goods and services that is often facilitated by a community-based on-line platform.

<Main Drivers>

- The spread of advanced digital platforms and devices

- Efforts to use material resources more efficiently, economic rationality

- New consumer needs - closer cooperation and a change in attitudes to ownership, more environmentally friendly consumption choices

- Social Changes - globalization and urbanization

<Main features of sharing economy companies>

- Sharing-based

- Idle capacities, resources

- On-demand access

- A higher degree of personal interaction

- Drive towards sustainability

The sharing economy is an expanding business model built on the idea of collaborative consumption, i.e., people relying on each other in obtaining access to merchandise. The core of the model is the consumer renting underused assets from other consumers, rather than purchasing new ones. First identified in the mid-2000s, it was galvanized by the radical expansion of social media, the development of the mobile Internet, and a growing sense of urgency regarding natural resource depletion.

This community practice has transformed into a very profitable business model for those running such online marketplaces, and in some cases for the "sellers." These brokers, utilizing both the Internet and mobile apps, have increased efficiency, reduced transaction costs, and made it possible for anyone to advertise their property and services. Prominent players include Airbnb (renting your home), RelayRides (renting your car), JustPark (renting your parking space), DogVacay (housing people's pets while they are on vacation), and Lyft and Uber (driving your car for others).

As these latter examples indicate, the sharing economy tends to blur into the gig economy, whereby people are enabled via web sites or mobile apps to work as contractors in place of steady employment.

Traditional competitors have raised concerns that such companies have an unfair advantage because they often do not comply with regulations that protect the public—and themselves, to varying degrees; in many cases legacy firms have challenged sharing economy firms in court. It appears likely, however, that the sharing economy is here to stay.

・Hopping Aboard the Sharing Economy (BCG)

・The Sharing Economy (PwC research)

・4 Big Trends for the Sharing Economy in 2019 (World Economic Forum)





SoLoMo

SoLoMo, a portmanteau of social-local-mobile, represents a model for connecting mobile users with local commerce based on their location and social media activity. The growing popularity of the smartphone influenced and allowed for the emergence of this trend. Unlike the PC, which can be located only by its IP address, the smartphone can be tracked by more precise GPS coordinates that are exchanged with applications. Furthermore, these apps make mobile phone users' social media activities and interests available for analysis that can yield further location validation and a more personalized experience. This combination of technologies provides access to the huge market of small local businesses that is difficult to reach by other means. From the perspective of local businesses, it finally lets them stand out to ever-more-connected users where they have the strongest competitive advantage—literally in their own backyard.

Search engines have acknowledged the significance of the shift in search technology. By employing geolocation technology (which identifies the geographical location of a device by means of electronic signals) and social media history, many search engines now tap into the previously untouched local market, presenting local businesses over more generic options. It's been estimated that 80% of mobile users prefer localized search results, and 75% of users are more likely to take action if a local option is presented.
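At the bottom of any localized search sits a distance computation between the user's GPS fix and each candidate business. A minimal sketch, using the standard haversine formula; the business names and coordinates are invented for the example:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km: mean Earth radius

# Hypothetical businesses: (name, latitude, longitude)
businesses = [
    ("Cafe A", 37.7750, -122.4194),
    ("Cafe B", 37.8044, -122.2712),   # across the bay, ~13 km away
]
user = (37.7749, -122.4194)           # user's GPS fix

# Keep only results within a 5 km radius of the user
nearby = [name for name, lat, lon in businesses
          if haversine_km(user[0], user[1], lat, lon) <= 5.0]
print(nearby)  # prints: ['Cafe A']
```

A production search engine would combine this distance score with relevance and social signals rather than applying a hard radius cutoff.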

The emergence of SoLoMo represents a complete paradigm shift from the tradition of publishing standard messages to all users. Instead, relevant content leveraging local offerings is presented by determining a user's location and activity on social media. Forrester analyst Jennifer Wise argues that the SoLoMo vision is too limited, and that what is really needed is to analyze a person's entire context and to reach them on whatever device they happen to be using. Applications should be built from the ground up following these principles in order to obtain higher levels of user engagement.

Big Data

The ubiquity of sensors and decreasing cost of storage have led to an explosion of data. This big data, and concomitant leaps in analytic tools and storage technology, create an unprecedented opportunity for applications to provide users with value, for companies to better understand their business—and for individuals billing themselves as data scientists. One common use of big data, predictive analytics, aims to micro-segment consumers and predict their behavior. More generally, machine learning algorithms react to patterns in data rather than explicit instruction to provide appropriate responses.

Big data has multiple defining characteristics, all chosen, by convention, to start with the letter 'V.' The initial three were proposed in 2001 by META Group analyst Doug Laney. A data set may exhibit one or more of these characteristics to be considered "big data."

First, big data has a very large volume, meaning that there's a lot of it. It may also have high velocity, meaning that it's generated at great speed. John Joseph, then of Lavastorm, notes that volatility, emphasizing the speed at which data changes or expires, is a more appropriate trait.

The variety of big data can have several interpretations. When data occurs in the form of numerous attributes, it provides both a challenge to determine relevance and an opportunity to benefit from it. Laney's interpretation indicates that data can take many forms. The broadest categorization is between structured data, organized by well-defined types (as in a traditional database), and unstructured data, such as text or images, whose minimal structure provides little support for information retrieval. The challenge of data variety comes when data in different formats must be harmonized.

Many other characteristics of big data have been proposed (and many derided by analytics strategy consultant Seth Grimes as "wanna-Vs"). Among these, veracity, i.e., that big data may come from untrustworthy sources, was proposed by IBM. Variance, the range of possible values, technically folds into volume. Variability, seen as how much of the data is subject to change (thus volatility), can be relevant to maintaining consistency.

The biggest challenge of big data is how to use it appropriately and intelligently. Tools that aid in data visualization can help with that. But for many people, the greatest concern with the big data accumulating in corporate and government databases is their own privacy. Questions such as "Whose data is it anyway?" must be addressed.

Cloud Computing

Cloud computing is the deployment of a network of servers to allow users to upload data and export processing. An external cloud is housed on remote servers, reducing the need to store and process information on local computing devices. To alleviate security concerns, some companies prefer to implement internal clouds on their own premises and within their corporate firewall. External clouds may be public, with hardware allocated among multiple, unrelated tenants, or private, with it dedicated to a particular organization, while internal clouds are generally private. In a community cloud, users pool computing resources. A hybrid cloud combines these mechanisms, allowing more control over local resources, but also affording an opportunity to supplement these with external resources.

The three primary instantiations of cloud computing export, in order of decreasing comprehensiveness, software applications (software as a service, SaaS), platforms to create, run, and manage such applications (platform as a service, PaaS), and computing resources (infrastructure as a service, IaaS). A recent addition exports analytics processing (analytics as a service, AaaS). In any of these cases, processing is said to take place in the cloud, as predicted by visionaries including J.C.R. Licklider and Prof. David Gelernter.

Because cloud computing customers pay on an on-demand basis, they avoid upfront infrastructure costs and can economically adjust their use to meet fluctuating needs. Furthermore, cloud services allow companies to outsource much of their IT management work and focus on the projects that differentiate their business. Cloud technologies promote collaboration by ensuring software compatibility and accessibility of data, and make it easier for employees to work remotely. The cloud's reach makes possible many new business models and, importantly, it has lowered the cost of starting a business, thus promoting entrepreneurship.

The next frontier, alluding to a dispersed cloud, is fog computing, in which some data and processing are pushed towards the edge of a network, occupied by devices, including users' mobile devices, that sense or modify their environment. This arrangement may provide efficiencies deemed essential for the Internet of Things and enable enhanced privacy of the localized data.

Software as a Service (SaaS)

Software as a service (SaaS) is a software distribution model in which a vendor or service provider hosts applications over the Internet and makes them available to customers. Software is centrally hosted and licensed on a subscription basis. This is the most common form of cloud computing, whereby the organization's data and much of its processing reside elsewhere.

SaaS relies on underlying technologies such as web services and service-oriented architecture (for structuring applications) and asynchronous data transfer (for responsive user interfaces), and has benefited from the maturation of those technologies. Centralized hosting of business applications dates back to the 1960s but was interrupted by the PC revolution. The expansion of the Internet in the 1990s supported a new class of centralized computing that eventually developed into SaaS, which now plays a role in office messaging systems, CRM software, payroll processing software, and many other applications.

Software as a service has two business models: hosted application management and software on demand. They differ only in whether the hosting company is also the software developer.

This developing technology presents several benefits. Software as a service supports easier administration, automatic updates and patch management, better compatibility among users, who will be on the same software version, and global accessibility via the web. It also allows the customer more flexibility in deciding what software to use and when.

Although SaaS is likely to remain a primary means of software delivery for some time to come, the more traditional approach of selling a license to download and use software is making a resurgence in the form of an app economy for the iPhone and Android mobile platforms.


Platform

In broad terms, a platform is an underlying computing environment with distinct rules and protocols that other technologies (software, hardware, applications, cloud computing, operating systems, etc.) must conform to in order to run properly. A platform can itself be hardware or software. A newer usage of "platform" refers to a web site or mobile app that brings multiple parties together to complete transactions, as in a multisided business model. To some, it seems, pretty much any application is a platform.

"Platform" can refer to a single system or a set of layered facilities provided to aid in software development (a technology stack). Common components are an operating system, a web server, a database, and a programming language. Platforms can be accessed through the cloud via PaaS. Cross platform technology involves the implementation of a common higher-level platform (such as Java bytecode) on multiple lower-level platforms. This allows a developer to reach users of those lower-level platforms while interacting only with the higher-level one (the traditional notion of compilation).

For a technology company, creating a platform is a way to empower users. Rather than predicting what specific problem customers are likely to encounter and providing them with a specific solution, the company can provide a form to enable an ecosystem of developers to compete among themselves to satisfy end-customer needs, all built upon a common solution infrastructure. This gives incredible reach to the technology underlying the platform.

A web browser, for example, can be thought of as a platform that executes pages defined in HTML, JavaScript, etc., allowing any number of web applications to function. Modern browsers are often platforms in another sense: they can be extended via add-ons. Thus, the organization creating the browser need no longer ensure that every service that every user might desire is present, only that it is possible for outside developers to create such services.

Internet of Things (IoT)

The Internet of Things (IoT) is a vision of a network with extremely broad reach resulting from the proliferation of new connected computing devices, or the enhancement of everyday objects with computation and connectivity as well as with new sensors and actuators. These objects can thus transmit large amounts of information to each other, opening up new possibilities for their intelligent and coordinated operation and potentially limiting the need for human involvement in their maintenance.

The IoT has been under development since the early 1980s. The first Internet-connected appliance was developed at Carnegie Mellon University—a Coke machine able to report inventory levels and drink temperatures. Since then, the IoT has progressed into smart grid (with connected appliances, meters, power generators, and storage cells), connected vehicles (to avoid collisions or coordinate with traffic signals), supply chain management (monitoring the freshness of merchandise or the conditions that it experienced), medical devices, environmental monitoring, etc. One recent application is a solar-powered garbage can that can announce its remaining capacity to sanitation trucks. The IoT is expected to take big data to another level as sensors become more ubiquitous.
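In the spirit of that first connected Coke machine, a toy model of device telemetry: a machine reports its stock and temperature as a JSON payload that a back-end service could inspect. The device ID, field names, and temperature range are invented for illustration:

```python
import json, random

class SmartVendingMachine:
    """Toy model of the CMU Coke machine: reports stock and temperature."""
    def __init__(self, device_id, slots):
        self.device_id = device_id
        self.slots = slots  # slot name -> cans remaining

    def telemetry(self):
        """Build the JSON payload the device would publish upstream."""
        return json.dumps({
            "device": self.device_id,
            "stock": self.slots,
            "temp_c": round(random.uniform(2.0, 6.0), 1),  # simulated sensor
        })

machine = SmartVendingMachine("coke-01", {"cola": 12, "diet": 0})
msg = json.loads(machine.telemetry())
restock = [slot for slot, n in msg["stock"].items() if n == 0]
print(restock)  # prints: ['diet'] -- a truck can be routed to refill this slot
```

A real deployment would publish such payloads over a lightweight messaging protocol (MQTT is common) rather than a local function call, but the sense-report-act pattern is the same.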

While the Internet of Things promises a world in which our every need is anticipated, many have raised criticisms regarding the concept. Privacy concerns over user consent, freedom of choice, and anonymity have been voiced as the system has grown to encompass many diverse fields. Security also poses a serious threat, as the IoT may connect and thus expose more critical systems, making them vulnerable to hacking. For these reasons, in addition to the drastically increased number of connected devices that must be handled, an overhaul of the tried-and-true Internet may be called for.

Internet of Value

A cryptocurrency is a medium of financial exchange playing a similar role to other currencies, but entirely virtual. For purposes of security and anti-counterfeiting, encryption techniques are used to regulate the generation of units of currency and the transfer of funds. Decentralized cryptocurrencies are not managed by any authority, such as a country’s central bank, a feature appealing to civil libertarians.

Developed in 2009 by the mysterious "Satoshi Nakamoto," Bitcoin was the first (and remains the best-known) decentralized cryptocurrency. Bitcoins can be generated by running a "mining" program that solves complex math problems, producing a proof-of-work with which it is easy to confirm the calculation. The mining process also validates transactions, converting them, along with the proof, to a block, which is added to a distributed public ledger called a blockchain. The blockchain is constructed so as to ensure that transaction history is preserved. Inflation is averted by the difficulty of performing the work, rather than the self-control of any government. The Bitcoin software is set to eventually cap the number of Bitcoins in circulation.
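The proof-of-work idea can be illustrated with a short sketch: search for a nonce whose hash of (previous hash + transaction data + nonce) begins with a required number of zeros. Finding the nonce is hard; checking it takes one hash. This is a toy model, not the actual Bitcoin block format, and the difficulty is far below the real network's:

```python
import hashlib

def mine(block_data, prev_hash, difficulty=4):
    """Search for a nonce whose SHA-256 hash has `difficulty` leading zeros.
    Finding it takes many attempts; verifying it takes a single hash."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{prev_hash}{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

# Mine a block chained to a (fake) all-zero previous hash
nonce, block_hash = mine("alice pays bob 1 BTC", prev_hash="0" * 64)

# Anyone can verify the work with one hash computation:
check = hashlib.sha256(f"{'0'*64}alice pays bob 1 BTC{nonce}".encode()).hexdigest()
assert check == block_hash and check.startswith("0000")
```

Because each block's hash incorporates the previous block's hash, altering any past transaction would invalidate every subsequent block, which is how the chain preserves transaction history.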

It is now recognized that the applications of the blockchain's distributed ledger technology (DLT) go well beyond currencies. Much as U.S. bills once represented gold, a customized cryptocurrency can stand in for any asset, allowing for a secure and efficient exchange framework supporting numerous forms of transaction, referred to as the Internet of Value (IOV), with the potential to disrupt firms providing escrow services (e.g., for collectibles and real estate) or ensuring integrity of supply chains. Cryptocontracts are computer programs that automatically execute the terms of a cryptocurrency-denominated contract.

The proof of a currency's utility is whether it can actually be exchanged for goods and services. Cryptocurrencies have thus far been plagued by extreme volatility and criminal activity. Current cryptocurrencies are limited by the inefficiencies and risks inherent in "mining," the need to distribute the entire blockchain, and the lack of a way to manage identities. Low transaction costs and numerous potential applications, however, suggest a bright future for these technologies.

Virtual Reality

Virtual reality (VR) replaces sensory data with generated content intended to seem real. This has been a philosophers' dream since Hilary Putnam's "brain in a vat" thought experiment (itself inspired by René Descartes' "evil demon"), and technology is finally catching up.

The term was coined by Jaron Lanier, who did early VR research. A main challenge for the technology is to generate incremental changes to the scene quickly enough in response to a subject's movements that the viewer does not get nauseous. All eyes are on Oculus (whose Rift is the premier VR product), which was initially crowdfunded, then venture-backed, and eventually purchased by Facebook—all prior to product release. Virtual reality technology tantalizingly promises to bring fantasy to life.

Computer games have often beckoned us to blur the lines between the real and the unreal. Virtual worlds go a step further in simulating or recreating real-world features such as people, organizations, and establishments, possibly alongside fictional or historical ones. In a virtual world, multiple "players" can interact via representations called avatars. Second Life is the most widely-used virtual world.

Augmented Reality

Augmented reality, i.e., real-time superimposition of computer-generated information onto an actual scene, is a form of mediated reality, which more generally changes (adds, modifies, or removes) sensory inputs. In the simplest applications, a user can be presented with information in textual or numeric form; in more complex applications stored image templates can be adapted to the user's frame of vision. It may leverage advances in eyewear such as Google Glass or Microsoft HoloLens to, for example, provide information about people in view using facial recognition technology or enable consumers to virtually try on clothes, cosmetics, or haircuts. Enterprise applications, which by their nature involve fewer privacy concerns, may come first. While virtual reality may allow student surgeons to perform simulated operations, augmented reality may enhance actual operations by allowing surgeons to visualize what lies beneath the skin.

Semantic Search

Semantic search is an information-searching technique that goes beyond comparing keywords in attempting to determine the intent behind the search query entered by the user, so as to generate more relevant results. It may be used for searching within closed dataspaces or on the Internet.

Semantic search is based on semantics, the science of meaning of language, and may make use of natural language processing (NLP) to understand both the search query and the corpus being searched. One challenge here is word sense disambiguation, i.e., determining the most probable intended meaning from all possible ones. Conversely, a search related to a given concept should return results without regard to the specific phrasing used. Another challenge is entity and relationship extraction, the determination of what real-world people and things are being referenced and how they relate.
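Word sense disambiguation can be illustrated with a toy version of the classic Lesk approach: choose the sense whose dictionary gloss shares the most words with the query. The senses and glosses below are invented for the example, not taken from a real lexicon:

```python
# Toy word-sense disambiguation in the spirit of the Lesk algorithm:
# pick the sense whose gloss overlaps most with the query's words.

SENSES = {
    "bank": {
        "financial": "institution that accepts deposits and lends money",
        "river": "sloping land beside a body of water",
    }
}

def disambiguate(word, query):
    """Return the sense of `word` whose gloss best overlaps the query."""
    context = set(query.lower().split())
    return max(SENSES[word],
               key=lambda s: len(context & set(SENSES[word][s].split())))

print(disambiguate("bank", "open a deposits account and lends money"))  # financial
print(disambiguate("bank", "fishing on the land beside the water"))     # river
```

Real systems replace the bag-of-words overlap with statistical models trained on large corpora, but the principle of scoring candidate meanings against context is the same.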

Meeting these challenges clearly requires substantial real-world knowledge. A semantic network can encode knowledge about the real world (things and relationships and the concepts that can categorize them) and may associate that knowledge with language syntax. The Semantic Web is intended to be a common repository for such content, tied in to the Internet. It may finally come to fruition through the accessibility of linked data, i.e., an association of entities with web addresses that could give web pages access to this information. Such a framework is seen as necessary infrastructure for the Internet of Things and cryptocontracts.

Semantic search may also take into account user context—the time and location, the activities the searcher is engaged in, the involvement of any other individuals, etc., any of which may provide insight into the searcher’s intention. Search engines are continually being enhanced so as to provide more benefits of semantic search. At the limit here are recommendation engines, in which the query is not supplied by the user at all but instead is derived from their behavior and context.

Artificial Intelligence

Although the search for artificial intelligence (AI) did not originate in Silicon Valley, it has taken hold here and evokes visions of the Singularity. Automation of tasks traditionally requiring human involvement underpins many of the most exciting technologies being developed here today, including voice-activated virtual assistants, robotics, and self-driving cars. It is easy to make a system seem intelligent if the range of inputs it is expected to consider is tightly constrained (e.g., a chess board). The real challenge of AI is robustness, i.e., systems broadly defined to operate across a variety of situations, formalized as domains. The quest for artificial general intelligence (AGI) pushes the minimization of domain assumptions to its limit, aiming for systems that can operate freely in human contexts.

Broadly inspired by a human sense of our proudest accomplishments, "good old-fashioned artificial intelligence" (GOFAI) systems access symbolic representations (models or maps) that extract essential features of some real-world domain. They may use an ontology to define characteristics of and associations among the most relevant entities in the domain. Some hope to achieve robustness by harmonizing systems created independently for different purposes, translating between their distinct representations, perhaps using an upper ontology general enough to describe the common features of entities and their interaction across all domains. Knowledge organized in this way can be made available through the Semantic Web.

The more humble, trusting approach of statistical machine learning devotes programming effort only to building algorithms that can be trained to recognize and predict known patterns or even discover anomalies. Such systems, often implemented on connectionist architectures such as neural networks, eschew representations in favor of direct contact with the environment. One disadvantage is that logical explanations for their recommendations are often unavailable.
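As a minimal illustration of learning patterns from examples rather than explicit instruction, a single perceptron (the simplest connectionist unit) can be trained to reproduce the logical AND pattern from labeled data:

```python
# A perceptron learns AND from examples: no rule for AND is ever coded,
# only a procedure for adjusting weights when a prediction is wrong.

def train(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred          # zero when prediction is correct
            w[0] += lr * err * x1        # nudge weights toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in data])  # prints: [0, 0, 0, 1]
```

Deep learning stacks many layers of such units, which lets the network learn far richer patterns, at the cost of the explainability noted above.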

Appreciation for the power of the dynamically constructed and statistically derived behaviors of machine learning, and especially the layered approach of deep learning, is growing, although the promise of understandable and verifiably correct systems remains appealing. Is it possible to build systems that apply logical reasoning over richly interrelated concepts in a traceable way, ultimately on the basis of machine learning algorithms? Our very existence appears to demonstrate that it is. On the other hand, just as we use language to communicate with each other, semantic representations may remain the best hope of connecting people with computer systems, and even computer systems with each other. As in the human case, there is some question of whether a single lingua franca is possible or even desirable.

Autonomous Vehicles

Autonomous vehicles, in the case of land vehicles aka robocars, are self-driving, i.e., they sense their environment and operate without human input. Aerial vehicles can also be autonomous, although teleoperated drones (controlled remotely) are not. Self-driving cars were viewed as science fiction only a decade ago, but many automobile manufacturers are now planning them. Autonomous vehicles are an endpoint on a continuum through which the burden on the driver is being reduced.

Self-driving cars may maintain maps and even update them based on sensory input that they gather. Radar, GPS, and computer vision are utilized to help the car sense its surroundings. Control systems interpret sensory information to identify roads, obstacles, and signs.
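A toy control loop illustrates the sense-interpret-act cycle: a range-sensor reading is mapped to a driving command. The thresholds and command names are invented for the example; real systems fuse many sensors and use far richer models:

```python
# Toy autonomous-braking controller: map a sensed obstacle distance
# (and current speed) to an actuation command. Thresholds are illustrative.

def control(distance_m, speed_mps):
    """Map a range-sensor reading to a driving command."""
    if distance_m < 10:
        return "brake_hard"
    if distance_m < 30 or speed_mps > 25:
        return "slow_down"
    return "cruise"

readings = [(100, 20), (25, 20), (5, 20)]  # (obstacle distance m, speed m/s)
print([control(d, v) for d, v in readings])
# prints: ['cruise', 'slow_down', 'brake_hard']
```

In a connected-vehicle setting, the same decision function could also take V2V or V2I messages as inputs, for example a signal from the car ahead that it is braking.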

Seemingly opposite technology relates to connected vehicles—vehicles able to communicate with each other (vehicle-to-vehicle, V2V) or with stationary facilities such as traffic signals (vehicle-to-infrastructure, V2I). It can be used to subordinate driver input to such communications, voluntarily or otherwise—for example, to prevent cars from crossing red lights or to coordinate the distance between vehicles. This technology informs the vision of intelligent transportation systems (ITS). The communication can be provided either as dedicated short-range communication (DSRC) or through Internet access, perhaps via wireless local area networks relayed between vehicles. Telematics systems leverage inter-vehicle communications to provide services including enhanced navigation, in-vehicle entertainment, and vehicle tracking.

Ethical and legal issues abound, such as who's liable for accidents, what values should underlie vehicle training, cybersecurity concerns of vehicle hacking, and privacy implications of third parties knowing everywhere we travel. Practically, consumers may be forgiven for wondering whether their cars will crash as frequently as their computers currently do, with more serious ramifications. Still, it is expected that this technology will save many lives. There is ongoing debate over the extent to which future vehicles will be autonomous or connected, but in one respect these points are not opposed. Whether vehicles act on their own or in conjunction with each other and the roadway, the human driver can be expected to play a diminishing role.

Synthetic Biology

Sometimes it seems like practically everybody in Silicon Valley is a software engineer. Yet we are surrounded by devices whose resilience and scale exceed anything that software engineers can direct or control, and whose programming languages we are only beginning to appreciate: biological systems.

Synthetic biology applies engineering principles to the fundamental components of biology in order to design and create new life. Its goal is to design biological systems by creating novel artificial biological pathways, organisms, or devices, or by redesigning natural ones. The CRISPR technique, derived from a defense system of strep bacteria, has made gene editing more accessible. These engineered biological systems could be used for purposes such as processing information, manipulating chemicals, fabricating materials and structures, producing energy, consuming environmental toxins, providing food, and maintaining and enhancing health. A holy grail would be to harness photosynthesis from plants to produce energy.

An MIT-based effort is underway to standardize DNA segments in a parts registry so that they can be reliably combined to produce predictable results. Such parts can be extracted from simple organisms in which they are well understood or developed using directed evolution, which mutates DNA and selects that which produces desired proteins or nucleic acids. Synthetic biology is complicated by differences in individual genomes as well as by findings of epigenetics that environmental factors influence behavioral traits through regulation of gene expression.
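Directed evolution's mutate-and-select loop can be sketched as a simple search over DNA strings: mutate the current best candidate and keep whichever variant scores highest. The target sequence and fitness function here are invented stand-ins for "produces the desired protein":

```python
import random

BASES = "ACGT"
TARGET = "GATTACA"  # stands in for a sequence coding a desired product

def fitness(seq):
    """Score a candidate by how many positions match the target."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.2):
    """Randomly substitute bases, mimicking induced mutation."""
    return "".join(random.choice(BASES) if random.random() < rate else base
                   for base in seq)

# Directed evolution loop: mutate the best candidate, keep improvements.
best = "".join(random.choice(BASES) for _ in range(len(TARGET)))
while best != TARGET:
    candidates = [mutate(best) for _ in range(20)] + [best]  # keep the parent
    best = max(candidates, key=fitness)

print(best)  # prints: GATTACA
```

Keeping the parent in each generation (elitism) guarantees fitness never decreases, so the loop converges; in the lab, the analogous step is screening variants and propagating only the best performers.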

Although synthetic biology offers extraordinary promise, there are many risks. Manipulating life forms, even with the best intentions, could have unintended consequences; the technology could be weaponized to intentionally inflict harm. Many have ethical concerns about tinkering with life forms, especially germline cells, whose modifications are passed to offspring. Evolution can be steered towards such modifications by use of a gene drive. Attention to the long-term implications of this work is called for.

CRISPR technique

CRISPR technology is a simple yet powerful tool for editing genomes. It allows researchers to easily alter DNA sequences and modify gene function. Its many potential applications include correcting genetic defects, treating and preventing the spread of diseases and improving crops. However, its promise also raises ethical concerns.

Brain-Computer Interface

A brain-computer interface (BCI), sometimes called a brain-machine interface (BMI), direct neural interface (DNI), or, to highlight the exotic implications, synthetic telepathy interface (STI), allows for direct communication between the brain and an external device. BCIs are directed at assisting, augmenting, or repairing human cognitive and/or sensory-motor functions. The interfaces of any processing system, including the brain, are sensors and actuators, so it should not be surprising that these applications primarily take input from or provide instructions to prostheses that provide hearing, sight, and, once trained to respond to electrical impulses in the brain, movement. Likewise, due to the brain's cortical plasticity, the BCI can handle signals from implanted interfaces as if they were natural sensory organs. In addition to working with prostheses, BCIs can bypass damaged nerve tissue to reach functioning body parts (for example, in treating Parkinson's disease).

One goal of research in this field is to reduce the invasiveness of these devices. Unfortunately, for some purposes it is still necessary to place electrodes beneath the subject's skull. Another challenge involves accessing the appropriate signal without interference. Progress is dependent upon continuing growth of knowledge of the brain and how it represents information, using techniques such as functional magnetic resonance imaging (fMRI) and diffusion tensor imaging (DTI). By analogy to the Human Genome Project, the Human Connectome Project looks to build a neural map of the brain.

Beyond restoration of human limbs and sensory organs, the technology could potentially be used to interface with any mechanical device, such as a keyboard or computer screen, or perhaps an Internet-connected computer. More nefarious applications, however, could intercept and interpret these electrical impulses to read thoughts or even generate them to implant false memories. Fortunately, due to the complexity of the brain, such developments are not yet realistic—but one might be alarmed and/or thrilled at how far we have already come.


Nanotechnology

The genesis of the field of nanotechnology is credited to Richard Feynman, who in 1959 postulated a series of machines constructing ever smaller machines. Nanotechnology is the manipulation of matter with at least one dimension on a scale of 1 to 100 nanometers (nm). A nanometer is one billionth of a meter, about the width of three or four atoms (the average human hair is about 25,000 nm wide). It is now possible to image, measure, model, and manipulate matter at the nanoscale. Nanotechnology leverages the unusual physical, chemical, and biological properties exhibited by substances at this scale. Microelectromechanical systems (MEMS) have dimensions a thousand times larger than those of nanoelectromechanical systems (NEMS). Soft nanotechnology is a convergence of synthetic biology and nanotechnology, inspired by the amazing machines in our cells.
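The scale comparisons above reduce to simple unit arithmetic; here is a minimal sketch, using the approximate hair-width figure quoted in the text:

```python
# Unit arithmetic for the nanoscale comparisons in the text.
NM_PER_METER = 1e9          # a nanometer is one billionth of a meter

hair_width_nm = 25_000      # approximate width of a human hair, in nm
nanoscale_max_nm = 100      # upper bound of the nanoscale range

# A hair expressed in meters: 25,000 / 1e9 = 2.5e-05 m (25 micrometers).
hair_width_m = hair_width_nm / NM_PER_METER

# A hair is 250 times wider than the largest nanoscale dimension.
ratio = hair_width_nm / nanoscale_max_nm

print(hair_width_m)   # 2.5e-05
print(ratio)          # 250.0
```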

Obviously, nanotechnology enables the creation of objects that will operate in very small spaces; an example would be delivering drugs within small blood vessels. Perhaps less obviously, it becomes possible to change the properties of materials by changing their internal composition. Materials can be made stronger, lighter, better or worse conductors of heat or electricity, more or less smooth, and/or more or less reactive. Properties such as magnetism and light reflectance can be modified. The possibilities for remaking existing products and imagining new ones are numerous.

Eric Drexler prognosticated precise molecular manufacturing (aka nanofacturing or nanofabrication) via self-replicating nanorobots (aka nanobots) constructed bottom-up from smaller components, in contrast to Feynman's top-down approach; this vision is now referred to as molecular nanotechnology (MNT). Drexler first stoked and later quelled fears of these devices turning the universe into grey goo. Other concerns about nanotechnology include the impact of nanopollutants on the environment and the invasion of privacy through virtually undetectable surveillance devices. Yet nanotechnology presents profound opportunities for disruption, as has been promised for decades. Its time may be arriving.

The Singularity

Computing pioneer John von Neumann remarked about "ever accelerating progress in technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity ... beyond which human affairs, as we know them, could not continue." Ray Kurzweil more recently popularized the Singularity as a "merger of human technology with human intelligence," viewed as an inevitable consequence of general exponential technological growth, along the lines of von Neumann's usage. Indeed, we have experienced disruptive technological progress at ever-shortening intervals. Still, it seems unlikely that any particular exponential increase in technology (including Moore's Law) can be sustained indefinitely.
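To make the exponential claim concrete, here is a minimal sketch of idealized Moore's Law doubling; the 1971 starting figure (~2,300 transistors, roughly an early microprocessor) is an approximation used for illustration only:

```python
# Idealized Moore's Law: transistor counts double every two years.
# Starting point (~2,300 transistors, circa 1971) is approximate.
count = 2_300
year = 1971
while year < 2021:
    count *= 2
    year += 2

# 25 doublings: 2,300 * 2**25 = 77,175,193,600
print(f"{year}: ~{count:,} transistors")
```

Twenty-five doublings turn thousands of transistors into tens of billions, which illustrates why any fixed exponential growth rate must eventually run up against physical limits.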

But the eventual ability of our computer programs to improve themselves would itself lead to exponential growth. Such an intelligence explosion, as posited by Irving John Good, would enable our computer programs to rapidly accelerate in effectiveness beyond human capabilities and beyond any human ability to foresee or control. The word "Singularity" is an implicit reference to gravitational singularities (as in black holes) and their inherent inscrutability. Likewise, as expressed in Vernor Vinge's event horizon thesis, we cannot envisage a world designed by those so much more intelligent than ourselves.

Parallels with end-of-times theology are considerable, including disagreements over both when the Singularity, the rapture of the nerds, can be expected and whether it will be a doomsday scenario or an exhilarating one. Some recommend mechanisms to avoid situations of "runaway" intelligence. Others foresee a merging of organic and inorganic intelligence (involving BCI) rather than a competitive strain of intelligence. Those living more in the present emphasize the serious, immediate implications of accelerating technological progress: severe inequality between those who can apply it productively and those who cannot.




