Eight things we learned from Intel's Innovation keynote

Intel has wrapped up its annual Innovation event in San Jose, where the chipmaker gave us a glimpse of what's coming down the pipeline over the next couple of years. If you don't have a spare 90 minutes to sit down and watch CEO Pat Gelsinger's keynote, here are the most significant things we learned.

Meteor Lake will launch on December 14th

The company officially introduced its "Meteor Lake" generation (known formally as Intel Core Ultra) to the world at the Innovation keynote. These chips will succeed the 13th Gen "Raptor Lake" line; they will be the first built on the new Intel 4 process and the first with a dedicated AI coprocessor inside.

They're also Intel's first consumer CPUs to combine distinct chiplets for each component (something competitors like AMD and Qualcomm have been doing for a while). In this case, there will be four tiles: compute, graphics, SoC, and I/O.

The SoC tile is essentially a low-power processor in its own right. In addition to features like wireless connectivity, native HDMI 2.1 and DisplayPort 2.1 support, and an integrated memory controller, the tile includes dedicated "low power island" E-cores designed specifically for lighter workloads. The idea is that this arrangement can offload lighter processes from the power-hungry compute tile. That, in theory, lets the chips save power, which is why Intel is calling Meteor Lake the most efficient client processor it has ever made.

On the gaming front, Meteor Lake can integrate Intel's Arc graphics directly on-chip. Not every Meteor Lake processor will get them, though; they're coming to "select MTL processor-powered systems with dual-channel memory," per the fine print.

Intel will challenge AMD's 3D V-Cache…at some point

In a Q&A session, Pat Gelsinger was asked whether Intel would challenge the 3D V-Cache technology that powers desktop chips like AMD's Ryzen 7 7800X3D, tech that also recently made its way to laptops. Gelsinger confirmed in response that Intel has a similar idea on its roadmap, though it won't be part of the Meteor Lake generation.

For those unfamiliar, 3D V-Cache lets AMD stack extra cache (high-speed, short-term memory) directly onto its CPUs. The results we saw from the ROG Strix Scar X3D (the colossal RTX 4090 gaming laptop where 3D V-Cache made its mobile debut) were great for AMD and grim for Intel. It's an incredibly powerful device that demolishes Intel's 4090 offerings.

Intel needs a response to 3D V-Cache to stay on top of the high-end gaming market. Looks like it's working on it.

Lunar Lake exists

In, like, some capacity, anyway. The Day 1 keynote included the world's first showing of a Lunar Lake system; we watched the PC generate a Taylor Swift-style song and a picture of a giraffe in a cowboy hat. You know, as PCs do.

Intel also confirmed that Lunar Lake is on track to ship in 2024. Like its predecessor, the Meteor Lake follow-up will use Intel's Foveros design. It's also expected to mark the commercial debut of Intel's 1.8nm manufacturing process, known as Intel 18A. (In human terms: its transistors will be really, really freaking small.)

"Panther Lake" is well underway

Gelsinger confirmed that a CPU generation called "Panther Lake" is set to be announced in 2025, and that the company has started working on it. (The name leaked recently after an Intel engineer inadvertently posted it on LinkedIn.) We know next to nothing about Panther Lake right now, but Intel says it's slated to enter production in fabs as soon as Q1 of 2024.

For those keeping track at home (and let's be real, I know you all are), this means the progression will likely go: Meteor (2023), Arrow (2024), Lunar (2024, most likely), Panther (2025).

Specialized chiplets are on the way

Gelsinger showed off Pike Creek, the world's first working UCIe-enabled chiplet-based processor. UCIe stands for "Universal Chiplet Interconnect Express," and it's essentially a plug-and-play standard that lets different silicon modules work together in one chiplet package. One chipmaker could take another company's chiplet and snap it into its own design. In theory, this would let chipmakers better specialize in particular kinds of chiplets and bring their products to market more quickly.

Intel will use the UCIe interface after Arrow Lake, and it's the first company to show functional silicon. (Intel donated the first version of the UCIe spec to the standards body that's developing it.)

Resin is out, glass is in

Intel currently uses an organic resin as the foundation of its chips. The company announced that it has started transitioning to a new technology that will let chips sit on a bed of glass instead. This should give Intel more room to pack in extra transistors, as well as (Intel expects) better data transfer, less warping, and less mechanical breakage under heat.

Don't get too excited: this isn't coming until the latter part of the decade, and it will first show up in, like, giant data center stuff.

A few reporters got to see this manufacturing process inside Intel's factory. CNET has some cool photos.

Xeon things are happening

Gelsinger announced the upcoming Sierra Forest Xeon processor, which has 288 E-cores. You know, in case you're finding that however many cores you have right now isn't enough for your backyard data center.

Intel also confirmed that the 5th Gen "Emerald Rapids" Xeon line will launch on December 14th of this year.

Microsoft Expands Copilot Voice and Think Deeper

Microsoft is taking a major step forward by offering unlimited access to Copilot Voice and Think Deeper, marking two years since the AI-powered Copilot was first integrated into Bing search. This update comes shortly after the tech giant revamped its Copilot Pro subscription and bundled advanced AI features into Microsoft 365.

What’s Changing?

Microsoft remains committed to its $20 per month Copilot Pro plan, ensuring that subscribers continue to enjoy premium benefits. According to the company, Copilot Pro users will receive:

  • Preferred access to the latest AI models during peak hours.
  • Early access to experimental AI features, with more updates expected soon.
  • Extended use of Copilot within popular Microsoft 365 apps like Word, Excel, and PowerPoint.

The Impact on Users

This move signals Microsoft’s dedication to enhancing AI-driven productivity tools. By expanding access to Copilot’s powerful features, users can expect improved efficiency, smarter assistance, and seamless integration across Microsoft’s ecosystem.

As AI technology continues to evolve, Microsoft is positioning itself at the forefront of innovation, ensuring both casual users and professionals can leverage the best AI tools available.

Stay tuned for further updates as Microsoft rolls out more enhancements to its AI offerings.

Google Launches Free AI Coding Tool for Individual Developers

Google has introduced a free version of Gemini Code Assist, its AI-powered coding assistant, for solo developers worldwide. The tool, previously available only to enterprise users, is now in public preview, making advanced AI-assisted coding accessible to students, freelancers, hobbyists, and startups.

More Features, Fewer Limits

Unlike competing tools such as GitHub Copilot, which limits free users to 2,000 code completions per month, Google is offering up to 180,000 code completions per month—a significantly higher cap designed to accommodate even the most active developers.

“Now anyone can easily learn, generate code snippets, debug, and modify applications without switching between multiple windows,” said Ryan J. Salva, Google’s senior director of product management.

AI-Powered Coding Assistance

Gemini Code Assist for individuals is powered by Google's Gemini 2.0 AI model and offers:

  • Auto-completion of code while typing
  • Generation of entire code blocks based on prompts
  • Debugging assistance via an interactive chatbot

The tool integrates with popular developer environments like Visual Studio Code, GitHub, and JetBrains, supporting a wide range of programming languages. Developers can use natural language prompts, such as:
"Create an HTML form with fields for name, email, and message, plus a submit button."
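
To give a sense of what such a prompt asks for, here is a minimal sketch of the kind of code an assistant could plausibly generate for an equivalent request, written as a TypeScript DOM helper rather than raw HTML. The function name and markup choices are illustrative assumptions, not Gemini Code Assist's actual output.

```typescript
// Hypothetical sketch: builds the form described in the prompt above
// (name, email, and message fields, plus a submit button).
function buildContactForm(): HTMLFormElement {
  const form = document.createElement("form");

  // One labeled single-line input each for name and email.
  const fields = [
    { label: "Name", name: "name", type: "text" },
    { label: "Email", name: "email", type: "email" },
  ];

  for (const field of fields) {
    const label = document.createElement("label");
    label.textContent = field.label;

    const input = document.createElement("input");
    input.type = field.type;
    input.name = field.name;

    label.appendChild(input);
    form.appendChild(label);
  }

  // The message field is a multi-line textarea rather than an input.
  const messageLabel = document.createElement("label");
  messageLabel.textContent = "Message";
  const message = document.createElement("textarea");
  message.name = "message";
  messageLabel.appendChild(message);
  form.appendChild(messageLabel);

  // Submit button rounds out the prompt's requirements.
  const submit = document.createElement("button");
  submit.type = "submit";
  submit.textContent = "Send";
  form.appendChild(submit);

  return form;
}

// Usage in a browser context:
// document.body.appendChild(buildContactForm());
```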

With support for 38 programming languages and a 128,000-token memory for processing complex prompts, Gemini Code Assist provides a robust AI-driven coding experience.

Enterprise Features Still Require a Subscription

While the free tier is generous, advanced features like productivity analytics, Google Cloud integrations, and custom AI tuning remain exclusive to paid Standard and Enterprise plans.

With this move, Google aims to compete more aggressively in the AI coding assistant market, offering developers a powerful and unrestricted alternative to existing tools.

Elon Musk Unveils Grok-3: A Game-Changing AI Chatbot to Rival ChatGPT

Elon Musk's artificial intelligence company xAI has unveiled its latest chatbot, Grok-3, which aims to compete with leading AI models such as OpenAI's ChatGPT and China's DeepSeek. Grok-3 is now available to Premium+ subscribers on Musk's social media platform X (formerly Twitter), as well as through xAI's mobile app and the new SuperGrok subscription tier on Grok.com.

Advanced Capabilities and Performance

Grok-3 has ten times the computing power of its predecessor, Grok-2. Initial tests show that Grok-3 outperforms models from OpenAI, Google, and DeepSeek, particularly in areas such as math, science, and coding. The chatbot offers advanced reasoning capabilities that can decompose complex questions into manageable tasks. Users can interact with Grok-3 in two different modes: "Think," which performs step-by-step reasoning, and "Big Brain," which is designed for more difficult tasks.

Strategic Investments and Infrastructure

To support the development of Grok-3, xAI has made major investments in its supercomputer cluster, Colossus, which is currently the largest globally. This infrastructure underscores the company’s commitment to advancing AI technology and maintaining a competitive edge in the industry.

New Offerings and Future Plans

Along with Grok-3, xAI has also introduced a logic-based chatbot called DeepSearch, designed to enhance research, brainstorming, and data analysis tasks. This tool aims to provide users with more insightful and relevant information. Looking to the future, xAI plans to release Grok-2 as an open-source model, encouraging community participation and further development. Additionally, upcoming improvements for Grok-3 include a synthesized voice feature, which aims to improve user interaction and accessibility.

Market Position and Competition

The launch of Grok-3 positions xAI as a major competitor in the AI chatbot market, directly challenging established models from OpenAI and emerging rivals such as DeepSeek. While Grok-3's performance claims have yet to be independently verified, early indications suggest it could have a significant impact on the AI landscape. xAI is actively seeking $10 billion in investment from major companies, demonstrating strong confidence in its technological advancements and market potential.
