Subscribers Only
Investment Alerts

The Power Of Logic In Investment Success

August 26, 2025

The sooner investors realize we are amid one of the great bull markets of our time, the better positioned they will be. Rarely in history has there been such a powerful confluence of bullish forces, and in past periods like this, the returns for equity markets were truly exceptional.

With the introduction of new technologies such as the radio, the railroad and the internet, each innovation unleashed waves of productivity, profitability and investor wealth. Today, we are in another transformative era, defined by major technological breakthroughs in artificial intelligence, digital assets and, even more recently, quantum computing.

The last few years have already been quite good in the stock market, but I believe in the months and years ahead there will be an acceleration of gains. And technology isn’t the only driver. The political and monetary backdrop is also turning supportive. A pro-business administration combined with an approaching interest rate cutting cycle provides a tailwind for both risk appetite and corporate earnings.

The surge in new IPOs this year is another clear signal of a bull market, as companies and investors alike often rush to the public markets when optimism and liquidity are running high.

While the broad market should perform very well in the years ahead, the real opportunity lies in identifying the sectors and companies that will deliver outsized returns. In this piece, we’ll explore which areas of the market look most compelling right now, and how investors can position themselves to take advantage of this unfolding bull market.

Stocks to Watch: AI, Digital Assets, and Quantum Computing

Just as past industrial and digital revolutions reshaped the economy, today’s breakthroughs in AI, digital assets, and quantum computing are creating entirely new industries while driving productivity, efficiency, and long-term growth.

AI adoption has been staggering: OpenAI alone now has over a billion weekly users, and the competing LLMs (large language models) have nearly as many. So far in the public markets, the real winners have been infrastructure providers: GPU and custom chipmakers, data center service providers, and energy suppliers. With hyperscaler spending on AI infrastructure expected to surpass $1 trillion in the coming years, there will be tidal waves of capital flowing into this sector.

Digital assets are also transforming markets. Bitcoin has reasserted itself as the top performing macro asset, surging more than 100% in just the last 12 months, while tokenization and stablecoins are changing how capital moves. Tokenization, still in its infancy, will allow 24/7 trading, instant settlement and numerous other innovations, while stablecoins provide a new layer of liquidity that enables faster capital redeployment across markets.

The adoption of these financial innovations is accelerating, with tokenized Treasuries having already crossed $7B in market size this year, and stablecoin settlement volumes now exceeding Visa’s annual payment volume, accounting for over $35 trillion in settlements between February 2024 and February 2025.

While Bitcoin remains the purest and most powerful way to gain exposure, brokerages offering crypto access and firms pioneering tokenized assets present equally compelling opportunities.

Quantum computing, once distant science fiction, is now entering commercialization. Advances in qubit coherence, error correction, and hybrid quantum-classical systems are opening real use cases in logistics, biotech, and materials. Stocks in the space have surged as investors begin to price in the massive long-term potential, though the industry is still relatively small. Researchers forecast that the total quantum computing market will grow at more than 30% annually through 2030.

Together, these forces are not only fueling speculative enthusiasm but attracting real capital flows, positioning AI, digital assets and quantum computing as the defining themes of this bull market.

Political and Economic Tailwinds for Equities

One of the most powerful forces behind equity bull markets is liquidity. Lower rates reduce borrowing costs for consumers and businesses, improving corporate earnings and encouraging new investment. Historically, markets tend to re-rate higher during these periods as valuations expand and investors are emboldened to take on more risk. Nearly every major bull run of the past century coincided with falling rates.

Beyond monetary policy, the political and regulatory backdrop is also supportive. The current administration has leaned heavily into business incentives, infrastructure investment, and targeted support for high-growth sectors like artificial intelligence and domestic manufacturing.

The surge in new public listings is another hallmark of a bull market. Investor appetite for IPOs has returned in force, with 219 IPOs on US exchanges so far in 2025, up 87% from the 117 IPOs by this point in 2024. High profile debuts such as Figma, a design platform, and Bullish, a crypto exchange, highlight both the diversity of sectors coming public and the confidence of private companies in tapping public markets. Historically, IPO booms have been closely tied to strong equity cycles, reflecting both abundant liquidity and investor optimism.

Together, falling rates, pro-business policy, and a vibrant IPO pipeline form a powerful set of tailwinds.

Ethan Feller, Zacks, 23 August 2025

Not only does this guy agree with me, but what he says makes sense. The conditions are in place for a massive bull market.

I have also thought of a key difference between the AI/data centre boom and investment booms like the 19th-century railway boom and the late 20th-century internet boom. Both those booms saw massive amounts of new equity capital raised to finance the investment spending. This time, much of the spending is being funded from cash flow, with no new equity being raised. The potential for this spending to deliver high returns is consequently greatly increased.

It creates a formidable virtuous circle. Companies invest massively out of free cash flow in data centres/AI factories, generating trillions of valuable tokens. This raises cash flow even further, which can again be profitably invested.

Let’s have another go at explaining tokens.

Under the hood of every AI application are algorithms that churn through data in their own language, one based on a vocabulary of tokens.

Tokens are tiny units of data that come from breaking down bigger chunks of information. AI models process tokens to learn the relationships between them and unlock capabilities including prediction, generation and reasoning. The faster tokens can be processed, the faster models can learn and respond.
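To make this concrete, here is a minimal sketch of what tokenization looks like in code. The vocabulary and the greedy longest-match rule are deliberately toy assumptions for illustration; real tokenizers (such as byte-pair encoding) build their vocabularies statistically from huge corpora, but the idea of splitting text into reusable pieces is the same.

```python
# Toy illustration (not a production tokenizer): split text into
# subword-like pieces drawn from a tiny hand-made vocabulary, the
# form in which an AI model actually sees its input.
def toy_tokenize(text, vocab):
    """Greedy longest-match tokenization against a small vocabulary."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest vocabulary entry that matches at position i.
        for length in range(min(len(text) - i, 10), 0, -1):
            piece = text[i:i + length]
            if piece in vocab:
                tokens.append(piece)
                i += length
                break
        else:
            tokens.append(text[i])  # fall back to a single character
            i += 1
    return tokens

vocab = {"token", "tokens", "un", "lock", " "}
print(toy_tokenize("tokens unlock", vocab))
# → ['tokens', ' ', 'un', 'lock']
```

Note how “unlock” splits into the two known pieces “un” and “lock”: subword vocabularies let a model handle words it has never seen whole.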

AI factories — a new class of data centers designed to accelerate AI workloads — efficiently crunch through tokens, converting them from the language of AI to the currency of AI, which is intelligence.

With AI factories, enterprises can take advantage of the latest full-stack computing solutions to process more tokens at lower computational cost, creating additional value for customers. In one case, integrating software optimizations and adopting the latest generation NVIDIA GPUs reduced cost per token by 20x compared to unoptimized processes on previous-generation GPUs — delivering 25x more revenue in just four weeks.

Nvidia

Tokens are so important that I think we should dig deeper.

Training an AI model starts with the tokenization of the training dataset.

Based on the size of the training data, the number of tokens can number in the billions or trillions — and, per the pretraining scaling law, the more tokens used for training, the better the quality of the AI model.

As an AI model is pretrained, it’s tested by being shown a sample set of tokens and asked to predict the next token. Based on whether or not its prediction is correct, the model updates itself to improve its next guess. This process is repeated until the model learns from its mistakes and reaches a target level of accuracy, known as model convergence.
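The “predict the next token” objective described above can be sketched with a deliberately simple stand-in: a bigram model that just counts which token follows which. Real models learn these statistics with neural networks over billions of tokens, but the training signal, getting better at guessing the next token, is the same in spirit.

```python
from collections import Counter, defaultdict

# Minimal sketch of next-token prediction: count successors in the
# training stream, then "predict" the most frequent one. This is a
# toy stand-in for the neural-network training loop, not how LLMs
# are actually implemented.
def train_bigram(tokens):
    successors = defaultdict(Counter)
    for current, nxt in zip(tokens, tokens[1:]):
        successors[current][nxt] += 1
    return successors

def predict_next(successors, token):
    if token not in successors:
        return None  # token never seen during "training"
    return successors[token].most_common(1)[0][0]

corpus = "the model reads the tokens and the model learns".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # → model
```

“model” follows “the” twice in the corpus against one “tokens”, so it wins the prediction; scale that counting idea up by many orders of magnitude and you have the intuition behind pretraining.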

After pretraining, models are further improved by post-training, where they continue to learn on a subset of tokens relevant to the use case where they’ll be deployed. These could be tokens with domain-specific information for an application in law, medicine or business — or tokens that help tailor the model to a specific task, like reasoning, chat or translation. The goal is a model that generates the right tokens to deliver a correct response based on a user’s query — a skill better known as inference.

How Are Tokens Used During AI Inference and Reasoning? 

During inference, an AI receives a prompt — which, depending on the model, may be text, image, audio clip, video, sensor data or even gene sequence — that it translates into a series of tokens. The model processes these input tokens, generates its response as tokens and then translates it to the user’s expected format.

Input and output languages can be different, such as in a model that translates English to Japanese, or one that converts text prompts into images.

To understand a complete prompt, AI models must be able to process multiple tokens at once. Many models have a specified limit, referred to as a context window — and different use cases require different context window sizes.

A model that can process a few thousand tokens at once might be able to process a single high-resolution image or a few pages of text. With a context length of tens of thousands of tokens, another model might be able to summarize a whole novel or an hourlong podcast episode. Some models even provide context lengths of a million or more tokens, allowing users to input massive data sources for the AI to analyze.
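The context-window constraint above amounts to a simple budget check before inference. The numbers below are made up for illustration, not any specific model's limits.

```python
# Illustrative context-window check: a request only fits if the
# prompt tokens plus the tokens reserved for the response stay
# within the model's context window.
def fits_in_context(prompt_tokens, context_window, reserved_for_output):
    """Return True if the request fits inside the model's window."""
    return prompt_tokens + reserved_for_output <= context_window

# A few pages of text easily fits a small window...
print(fits_in_context(3_000, 4_096, 512))       # → True
# ...but a whole novel needs a long-context model or chunking.
print(fits_in_context(150_000, 32_768, 1_024))  # → False
```

When a prompt doesn't fit, applications typically truncate it, summarize it, or split it into chunks processed separately.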

Reasoning AI models, the latest advancement in LLMs, can tackle more complex queries by treating tokens differently than before. Here, in addition to input and output tokens, the model generates a host of reasoning tokens over minutes or hours as it thinks about how to solve a given problem.

These reasoning tokens allow for better responses to complex questions, just like how a person can formulate a better answer given time to work through a problem. The corresponding increase in tokens per prompt can require over 100x more compute compared with a single inference pass on a traditional LLM — an example of test-time scaling, aka long thinking.

How Do Tokens Drive AI Economics? 

During pretraining and post-training, tokens equate to investment into intelligence, and during inference, they drive cost and revenue. So as AI applications proliferate, new principles of AI economics are emerging.

AI factories are built to sustain high-volume inference, manufacturing intelligence for users by turning tokens into monetizable insights. That’s why a growing number of AI services are measuring the value of their products based on the number of tokens consumed and generated, offering pricing plans based on a model’s rates of token input and output.

Some token pricing plans offer users a set number of tokens shared between input and output. Based on these token limits, a customer could use a short text prompt that uses just a few tokens for the input to generate a lengthy, AI-generated response that took thousands of tokens as the output. Or a user could spend the majority of their tokens on input, providing an AI model with a set of documents to summarize into a few bullet points.
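The economics of those two usage patterns are easy to put in numbers. The per-token prices below are hypothetical, chosen only to show the arithmetic; real providers publish their own rates, usually quoted separately for input and output per million tokens.

```python
# Back-of-the-envelope token economics with assumed prices of
# $1 per million input tokens and $4 per million output tokens
# (hypothetical figures, not any provider's actual rates).
def prompt_cost(input_tokens, output_tokens,
                input_price_per_m, output_price_per_m):
    """Dollar cost of one request under per-million-token pricing."""
    return (input_tokens / 1_000_000 * input_price_per_m
            + output_tokens / 1_000_000 * output_price_per_m)

# Short prompt, long AI-generated answer: cost is output-dominated.
chat = prompt_cost(50, 2_000, 1.0, 4.0)
# Long document summarized into bullets: cost is input-dominated.
summarize = prompt_cost(100_000, 300, 1.0, 4.0)
print(f"chat: ${chat:.5f}, summarize: ${summarize:.4f}")
```

The asymmetry explains why the same token allowance can feel very different depending on whether a workload is generation-heavy or ingestion-heavy.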

To serve a high volume of concurrent users, some AI services also set token limits, the maximum number of tokens per minute generated for an individual user.

Tokens also define the user experience for AI services. Time to first token, the latency between a user submitting a prompt and the AI model starting to respond, and inter-token or token-to-token latency, the rate at which subsequent output tokens are generated, determine how an end user experiences the output of an AI application.
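Both of those experience metrics fall straight out of token-arrival timestamps. The timings below are invented for illustration.

```python
# Compute the two latency metrics from token-arrival timestamps,
# given in seconds since the prompt was submitted (made-up numbers).
def latency_metrics(arrival_times):
    ttft = arrival_times[0]  # time to first token
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    inter_token = sum(gaps) / len(gaps)  # mean token-to-token latency
    return ttft, inter_token

times = [0.40, 0.45, 0.50, 0.56, 0.61]  # five output tokens
ttft, itl = latency_metrics(times)
print(f"time to first token: {ttft:.2f}s, inter-token: {itl:.4f}s")
```

Here the user waits 0.40s for the stream to start, then tokens arrive roughly every 0.05s, fast enough to keep pace with a typical reading speed.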

There are tradeoffs involved for each metric, and the right balance is dictated by use case.

For LLM-based chatbots, shortening the time to first token can help improve user engagement by maintaining a conversational pace without unnatural pauses. Optimizing inter-token latency can enable text generation models to match the reading speed of an average person, or video generation models to achieve a desired frame rate. For AI models engaging in long thinking and research, more emphasis is placed on generating high-quality tokens, even if it adds latency.

Developers have to strike a balance between these metrics to deliver high-quality user experiences with optimal throughput, the number of tokens an AI factory can generate.

To address these challenges, the NVIDIA AI platform offers a vast collection of software, microservices and blueprints alongside powerful accelerated computing infrastructure — a flexible, full-stack solution that enables enterprises to evolve, optimize and scale AI factories to generate the next wave of intelligence across industries.

Understanding how to optimize token usage across different tasks can help developers, enterprises and even end users reap the most value from their AI applications.

Nvidia

Strategy – What Would Spock Do?

We need to know where to invest to capitalise on the incredible potential of AI to transform the world. We need to think logically, which may also mean thinking outside the box. Analysts live in a world of spreadsheets. It helps to justify their existence. But spreadsheets don’t appear to provide the right answers. On the contrary, spreadsheets have led to many wrong decisions because the most exciting companies invariably look expensive.

It’s almost a Catch-22: the most exciting companies will ALWAYS look too expensive to buy.

We need to apply common sense. Forget about PE ratios, valuations and all the minutiae of the stock market, and ask yourself the obvious question: which are the most important businesses driving the global AI boom?

It is the companies building the infrastructure of that boom, the companies helping enterprises and governments apply AI, and, probably further down the road, the companies that move quickest to apply AI to their own operations. My Top 20 list is full of companies that fall into one or other of these categories. They are the obvious ones in which to invest.

Most analysts don’t dispute that but agonise over whether their shares are correctly priced. This is a bit like the medieval arguments over how many angels could dance on the head of a pin. The only answer to both questions is that God alone knows. If an all-knowing creator doesn’t exist, nobody knows at all; and even if one does, God is not sharing his insights.

Hence, my argument is that once you have chosen which shares to buy, use the inevitable share price volatility to make sure you invest at a sensible price. Even then, we don’t know what a sensible price is, so buying a few at intervals makes sense. Let the market decide the price. As long as most of the shares you buy are in long-term uptrends, all should be well.

The advantage of doing this is that you can invest with reasonable safety in the most exciting names.

Further reading

Mega Caps On The March — August 23, 2025

QQQ3 Hits A Wall; High Valuation, High Growth, High Volatility — August 22, 2025

Tech Earnings Poised To Explode In 2026 — August 8, 2025

WOW! Sizzling Results Underwrite AI Boom — August 5, 2025