
OpenAI launches web crawler ‘GPTBot’ amid plans for next model: GPT-5

Website owners can block the web crawler by adding a “disallow” rule to the robots.txt file on their server.

Artificial intelligence firm OpenAI has launched “GPTBot” — its new web crawling tool which it says could potentially be used to improve future ChatGPT models.

“Web pages crawled with the GPTBot user agent may potentially be used to improve future models,” OpenAI said in a new blog post, adding that it could improve accuracy and expand the capabilities of future iterations.

A web crawler, sometimes called a web spider, is a type of bot that indexes the content of websites across the internet. Search engines such as Google and Bing use crawlers to index websites so they can appear in search results.
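
To make the mechanics concrete, here is a minimal crawler sketch in Python using only the standard library. The user agent string, seed URL and link-extraction logic are illustrative assumptions for the example, not a description of how GPTBot or any production crawler works.

```python
# Minimal web crawler sketch (illustrative only; not how GPTBot works).
# It checks robots.txt, fetches a seed page and collects the links it finds.
import urllib.request
import urllib.robotparser
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

USER_AGENT = "ExampleCrawler/0.1"   # hypothetical user agent string
SEED_URL = "https://example.com/"   # placeholder seed page

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def allowed_by_robots(url: str) -> bool:
    """Consult the site's robots.txt, as polite crawlers do before fetching."""
    parts = urlparse(url)
    robots = urllib.robotparser.RobotFileParser()
    robots.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    robots.read()
    return robots.can_fetch(USER_AGENT, url)

def crawl(url: str) -> list[str]:
    """Fetch one page and return the absolute URLs it links to."""
    if not allowed_by_robots(url):
        return []
    request = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(request) as response:
        html = response.read().decode("utf-8", errors="replace")
    extractor = LinkExtractor()
    extractor.feed(html)
    # Resolve relative links against the page URL so they can be visited next.
    return [urljoin(url, link) for link in extractor.links]

if __name__ == "__main__":
    for link in crawl(SEED_URL):
        print(link)
```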

OpenAI said the web crawler will collect publicly available data from the world wide web, but will filter out sources that sit behind paywalls, are known to gather personally identifiable information, or contain text that violates its policies.

It should be noted that website owners can block the web crawler by adding a “disallow” rule to the robots.txt file on their server.

Instructions for website owners to “disallow” GPTBot. Source: OpenAI
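
For reference, the block typically amounts to a couple of lines in the site’s robots.txt file. The sketch below uses the “GPTBot” user agent string from OpenAI’s published guidance; the directory paths in the second example are placeholders.

```
# Block GPTBot from the entire site
User-agent: GPTBot
Disallow: /

# Or allow some directories and block others (paths are placeholders)
User-agent: GPTBot
Allow: /public/
Disallow: /private/
```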

The new crawler comes three weeks after the firm filed a trademark application for “GPT-5,” the anticipated successor to the current GPT-4 model.

The application was filed with the United States Patent and Trademark Office on July 18 and covers use of the term “GPT-5” for software relating to AI-generated human speech and text, audio-to-text conversion, and voice and speech recognition.

However, observers may not want to hold their breath for the next iteration of ChatGPT just yet. In June, OpenAI co-founder and CEO Sam Altman said the firm is “nowhere close” to beginning training GPT-5, explaining that several safety audits need to be conducted before training can start.

Related: 11 ChatGPT prompts for maximum productivity

Meanwhile, concerns have been raised over OpenAI’s data-collection practices of late, particularly around copyright and consent.

In June, Japan’s privacy watchdog issued a warning to OpenAI about collecting sensitive data without permission, while in April, Italy temporarily banned ChatGPT after alleging it breached various European Union privacy laws.

In late June, a class action was filed against OpenAI by 16 plaintiffs alleging that the AI firm accessed private information from ChatGPT users’ interactions.

If these allegations prove accurate, OpenAI — and Microsoft, which was named as a defendant — would be in breach of the Computer Fraud and Abuse Act, a law with a precedent for web-scraping cases.

Magazine: AI Eye: AI travel booking hilariously bad, 3 weird uses for ChatGPT, crypto plugins


A brief history of the internet

Gain insights into the key milestones and trends that have shaped the internet into the global phenomenon it is today.

The internet has become an integral part of our daily lives, revolutionizing how we communicate, access information and conduct business. Over several decades it has evolved from a modest research project into a global network connecting billions of people. This article presents a timeline of the internet’s evolution, highlighting key milestones that have shaped its development.

The birth of ARPANET (1969)

The internet traces its origins to the Advanced Research Projects Agency Network (ARPANET), created by the United States Department of Defense in the late 1960s. ARPANET enabled research institutions and universities to communicate and share data more easily. The first message sent over ARPANET, on Oct. 29, 1969, was a crucial turning point in the development of the internet.

TCP/IP and the birth of the internet protocol suite (1970s)

The transmission control protocol/internet protocol (TCP/IP) is a protocol framework created in the 1970s that laid the groundwork for the modern internet. TCP/IP offered a set of uniform rules for sending and receiving data packets across networks, allowing different kinds of computers and networks to communicate seamlessly. This innovation gave rise to the internet protocol suite, which serves as the foundation of the internet.
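
As a minimal illustration of that layered design, the sketch below uses Python’s standard socket module to send a few bytes over a local TCP connection and read them back; the port number and message are arbitrary choices for the demo.

```python
# Minimal TCP round trip over the loopback interface, using Python's socket module.
# TCP/IP handles packetizing, ordering and retransmission beneath these few calls.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # arbitrary local port chosen for the demo
ready = threading.Event()

def echo_server() -> None:
    """Accept one connection and echo back whatever it receives."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.bind((HOST, PORT))
        server.listen(1)
        ready.set()                      # signal that the server is listening
        conn, _addr = server.accept()
        with conn:
            data = conn.recv(1024)       # read up to 1024 bytes
            conn.sendall(data)           # send them straight back

if __name__ == "__main__":
    threading.Thread(target=echo_server, daemon=True).start()
    ready.wait()                         # don't connect before the server is ready

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect((HOST, PORT))       # TCP handshake happens here
        client.sendall(b"hello over TCP/IP")
        print(client.recv(1024).decode())  # prints: hello over TCP/IP
```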

Usenet and email’s development (1980s)

The internet saw considerable breakthroughs in communication technologies during the 1980s. During this period, email — a crucial element of contemporary online communication — became widespread, improving worldwide connectedness by enabling users to send and receive electronic messages across networks. A distributed discussion system called Usenet was also created, allowing users to participate in newsgroups and exchange information on a variety of subjects.

The World Wide Web (1990s)

With the creation of the World Wide Web (WWW), the 1990s were a transformative decade in the history of the internet. In 1989, British computer scientist Tim Berners-Lee proposed a system of linked hypertext documents accessible via the internet.

In 1991, the first web page was published, and web browsers such as Mosaic (1993) and Netscape Navigator (1994) soon made navigating the web far more user-friendly.

The advent of search engines, such as Yahoo and Google, further improved online information discovery.

E-commerce and the dot-com boom (late 1990s)

The dot-com boom — a time of explosive growth and investment in internet-based businesses — occurred in the late 1990s. E-commerce flourished during this period, with big players in online retail emerging, like Amazon and eBay. Advancements in web technologies and the widespread use of secure online payment methods fueled the expansion of online shopping, revolutionizing the buying and selling of goods and services.

The emergence of Web2 and social media (early 2000s)

Social networking websites and the idea of Web2 came into existence in the 2000s. MySpace, Facebook (now Meta) and Twitter, among other websites, transformed online communication by enabling users to set up accounts, exchange information, and connect with people all over the world.

Web2 promoted user-generated content, interaction, and collaboration, making the online experience more interactive and dynamic. Additionally, cloud computing emerged in the early 2000s, providing scalable, adaptable computing resources to individuals and companies.

Remote data storage, access to processing power and the capacity to host applications were all made available by services like Amazon Web Services (AWS) in 2006.

The rise of the internet beyond the late 2000s

The internet continues to evolve rapidly, driven by technological advancements, expanding connectivity and the integration of digital services into daily life. Here’s a brief history of the internet beyond the late 2000s:

Mobile internet and the app revolution (2010s)

  • 2008: Apple’s App Store is launched, revolutionizing mobile app distribution.
  • 2010: The proliferation of smartphones and mobile devices leads to the rise of the mobile internet.
  • 2012: Google Play (formerly Android Market) launches as the primary app store for Android devices.
  • Mobile applications offer a wide range of services, including communication, entertainment, productivity and e-commerce.

First commercial transaction using Bitcoin

  • 2010: The first commercial transaction using Bitcoin (BTC) occurred, marking a pivotal moment in internet history.
  • May 22, 2010: The date is commonly known as Bitcoin Pizza Day, when Laszlo Hanyecz exchanged 10,000 BTC for two pizzas, highlighting the potential of cryptocurrencies in real-world transactions. This event showcased the disruptive power of digital currency and its ability to revolutionize traditional financial systems.

Expansion of broadband and high-speed internet

  • Early 2010s: Broadband internet access continues to expand globally.
  • Improved online experiences, faster data transfers and the ability to stream high-definition content.

The rise of social networking and messaging apps

  • Late 2000s to early 2010s: Social networking platforms like Facebook and Twitter continue to dominate.
  • 2010–2013: Messaging apps like WhatsApp (2010), WeChat (2011) and Telegram (2013) gain popularity, providing real-time communication and sharing capabilities.

Cloud computing and storage

  • Early 2010s: Cloud computing becomes increasingly prevalent.
  • 2006: AWS offers scalable computing resources.
  • 2010: Microsoft Azure and Google Cloud Platform enter the market.
  • 2007–2012: Cloud storage services like Dropbox (2007) and Google Drive (2012) gain popularity, providing convenient file storage and synchronization.

Related: 7 real-world cloud computing examples to know

Internet of Things and connected devices

  • Late 2000s to present: The Internet of Things (IoT) has continued to grow since its inception in the late 2000s.
  • Smart home devices, wearable technology, and industrial applications gain momentum.
  • Interconnectivity enables automation, remote monitoring and data collection.

Streaming and on-demand entertainment

  • Late 2000s and ongoing: Streaming platforms transform the entertainment industry.
  • 2006: Amazon Prime Video launches as Amazon Unbox.
  • 2007: Netflix introduces its streaming service.
  • 2008: Hulu launches as a free, ad-supported streaming service, later introducing subscription-based plans. Spotify also launches its music streaming service the same year.
  • 2015: Apple Music launches.
  • 2019: Disney+ launches with an extensive library of Disney, Pixar, Marvel, Star Wars and National Geographic content.

Enhanced online security and privacy concerns

  • Ongoing concern: Online security and privacy have become more prominent issues.
  • High-profile data breaches and cyberattacks raise awareness about the importance of secure practices.
  • Encryption technologies, secure protocols and multifactor authentication have become important tools to protect user privacy and data integrity.

Artificial intelligence and machine learning

Related: 5 key features of machine learning

Expansion of 5G and next-generation networks

  • Late 2010s and ongoing: The deployment of fifth-generation (5G) networks expands.
  • 2019: Commercial deployment of 5G begins in select regions.
  • 5G promises faster speeds, lower latency and increased network capacity.
  • 5G enables emerging technologies like autonomous vehicles, virtual and augmented reality.

The rise of Web3

Gavin Wood, a co-founder of Ethereum, coined the term “Web3” in 2014, signaling a significant turning point in the internet’s development. Initiatives like Vitalik Buterin’s Ethereum, which offers smart contract functionality for building decentralized applications (DApps), gained momentum. The Ethereum blockchain fostered a thriving ecosystem in which DApps, decentralized finance (DeFi) and nonfungible tokens (NFTs) could flourish.

The decentralized autonomous organization known as The DAO grabbed headlines in 2016 for its cutting-edge experiment in decentralized governance. Despite its difficulties and weaknesses, it established a framework for collective decision-making on blockchain-based platforms.

The idea of Web3 evolved over time, moving beyond Ethereum. Other blockchain platforms with distinct features and focuses, such as Polkadot, Solana and BNB Smart Chain, have also appeared. These platforms promoted a competitive environment for Web3 development by addressing scalability, interoperability and developer experience.

Web3 also includes self-sovereign identity, in which people control their personal data and choose whom to share it with. Sovrin, uPort and SelfKey are a few examples of decentralized identity protocols leading the way toward a more user- and privacy-centric internet.

Growing demands for data privacy, ownership and transparency align with the emergence of Web3. Users are now more conscious of the value of their own data and the dangers of centralized platforms. By providing options that promote privacy and give users control over their digital lives, Web3 technology empowers people.

Additionally, Web3 has experienced rising popularity in the area of digital collectibles and art via NFTs. Blockchain technology has enabled these one-of-a-kind tokens to provide verifiable ownership and provenance for digital goods. This has revolutionized the art market, giving producers and collectors new opportunities.

With continued attempts to enhance scalability, usability and interoperability, Web3’s journey is far from over. As the movement picks up steam, it challenges the conventionally centralized paradigm of the internet, while imagining a future when users will have more sovereignty over their data, privacy and decision-making.

The future of the internet

The internet’s future holds enormous promise for revolutionary developments. As technologies such as AI, 5G networks and the IoT continue to mature, the internet will become even more deeply ingrained in our lives, with faster, more dependable connectivity enabling seamless communication and immersive experiences.

Through Web3 and decentralized technologies, people will have more control over their data and online experiences. Privacy and cybersecurity will become increasingly crucial as the digital world develops, necessitating stronger security measures. The future of the internet is full of promise for innovation, connectivity and a digital environment open to all users.


History of computing: From the abacus to quantum computers

Journey through time and witness the remarkable advancements that have shaped the computing world we know today.

From the earliest mechanical devices to the most advanced quantum computers of the present, the history of computing is a fascinating trip spanning thousands of years. 

Let’s explore the significant turning points in computing history, starting with the abacus and progressing through quantum computers.

Abacus (3,000 BCE)

The abacus, which dates back to around 3,000 BCE, is frequently cited as the earliest known computing device. Beads strung on a set of rods or wires were slid back and forth to perform basic arithmetic.

Mechanical calculators (17th to 19th centuries)

Several mechanical calculators, including Blaise Pascal’s Pascaline and Gottfried Leibniz’s stepped reckoner, were developed during this time. These devices used gears, wheels and other mechanical components to carry out calculations.

Analytical Engine (1837)

Charles Babbage designed the Analytical Engine, a mechanical computer capable of performing a wide range of calculations, in 1837. It was never built during Babbage’s lifetime, but its use of punched cards for input and output makes it a forerunner of modern computers.

Tabulating machines (late 19th to early 20th centuries)

Herman Hollerith invented tabulating machines in the late 19th and early 20th centuries, which processed and analyzed data using punched cards. These devices were crucial to the advancement of modern computers and were employed for tasks like tabulating census data.

Vacuum tube computers (1930s–1940s)

Vacuum tube computers, including the Atanasoff-Berry Computer (ABC) and the Electronic Numerical Integrator and Computer (ENIAC), signaled the transition from mechanical to electronic computing in the 1930s and 1940s. Vacuum tubes made faster calculations and more advanced functionality possible.

Transistors (1947)

The invention of the transistor at Bell Laboratories in 1947 by John Bardeen, Walter Brattain and William Shockley revolutionized computing. Transistors replaced bulky vacuum tubes with smaller, more dependable electronic components, enabling smaller and faster computers.

Integrated circuits (1958)

Jack Kilby (1958) and Robert Noyce (1959) independently developed the integrated circuit, which allowed numerous transistors and other electronic components to be integrated onto a single chip. This innovation cleared the path for miniaturized electronics and microprocessors.

Personal computers (1970s–1980s)

The Altair 8800 and later computers like the Apple II and IBM PC helped popularize personal computing in the 1970s and 80s. These cheaper and more user-friendly computers made computing more accessible to both individuals and companies.

Internet and World Wide Web (1990s)

With the advent of the internet and the growth of the World Wide Web, computing expanded into a vast worldwide network of interconnected devices. Tim Berners-Lee created HTTP, HTML and the URL scheme to make information sharing and browsing simple.
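
As a small illustration of how those pieces fit together, the sketch below uses Python’s standard http.client module to request a URL over HTTP and print the start of the HTML it returns; the hostname is only a placeholder.

```python
# Fetch a page over HTTP with Python's standard http.client module.
# The URL names the resource, HTTP transfers it, and HTML describes its content.
import http.client

conn = http.client.HTTPSConnection("example.com")   # placeholder host
conn.request("GET", "/")                             # issue the HTTP request
response = conn.getresponse()

print(response.status, response.reason)              # e.g. 200 OK
print(response.getheader("Content-Type"))            # e.g. text/html; charset=UTF-8
print(response.read().decode()[:200])                # first 200 characters of the HTML
conn.close()
```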

Mobile and cloud computing (2000s)

The emergence of smartphones and tablets, as well as advancements in wireless technology, helped facilitate the widespread use of mobile computing. Furthermore, the idea of cloud computing arose, offering scalable and on-demand access to computing resources via the internet.

Quantum computers (present)

Quantum computing is an emerging technology that uses the laws of quantum mechanics to carry out calculations. Quantum computers use qubits, which can exist in superposition and entangled states, as opposed to classical computers, which use binary bits (0s and 1s). Although still in the early phases of research, viable quantum computers have the potential to solve certain difficult problems far more quickly than classical computers.
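
In standard Dirac notation, a single qubit’s state is a superposition of the two basis states, which is what separates it from a classical bit that must be exactly 0 or 1:

```latex
% A qubit is a unit vector in a two-dimensional complex state space.
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \alpha, \beta \in \mathbb{C},
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
% Measurement yields 0 with probability |alpha|^2 and 1 with probability |beta|^2,
% so a classical bit is the special case where one amplitude is exactly 1.
```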

The future of computing

The developments achieved from the abacus to quantum computers have created an exhilarating and constantly changing landscape for the field of computing. Here are some significant developments and opportunities for computers in the future:

Artificial intelligence (AI) and machine learning (ML)

Artificial intelligence and machine learning will continue to be key factors in the development of computing. These technologies, which give computers the capacity to learn, reason and make judgments, have made possible advances in fields such as natural language processing (NLP), computer vision and robotics.

AI-driven systems will advance in sophistication, having an impact on a number of sectors, including healthcare, banking, transportation and customer service.

Internet of Things (IoT)

The Internet of Things refers to the linking of numerous devices and objects so that they can communicate and share data. The IoT will develop further as processing power keeps rising and becomes more energy-efficient.

There will be an abundance of connected devices, enabling smart homes, smart cities and productive industrial operations. The IoT will produce enormous amounts of data, necessitating sophisticated computing techniques for analysis and decision-making.

Edge computing

Rather than depending only on centralized cloud infrastructure, edge computing processes data closer to the source. Edge computing will be more significant as IoT devices and real-time applications expand.

Edge computing offers quicker and more effective processing by lowering latency and enhancing data privacy, which benefits industries including autonomous vehicles, healthcare monitoring and smart grids.

Related: 10 emerging technologies in computer science that will shape the future

Quantum internet and quantum communication

The creation of a quantum internet is being investigated in addition to quantum computing. The principles of quantum physics are used in quantum communication to secure and send data.

A global network of safe communication and data transfer could be made possible via quantum networks, which could offer improved security, lightning-fast and impenetrable encryption, and quantum teleportation.

Neuromorphic computing

The goal of neuromorphic computing, which draws inspiration from the structure and function of the human brain, is to create computer systems that resemble neural networks.

For tasks like pattern recognition, data processing and cognitive computing, these systems might provide greater efficiency and performance. Neuromorphic computing may facilitate the development of artificial intelligence and brain-machine interactions.

Related: What is black-box AI, and how does it work?

Ethical and responsible computing

As computing develops, ethical issues take on more significance. Concerns such as privacy, bias in AI algorithms, cybersecurity and the effect of automation on employment and society need to be addressed. Responsible practices, laws and frameworks will be necessary to ensure that technology is used for the benefit of humanity.

The potential for innovation and revolution in a variety of fields is enormous for the future of computing. AI, quantum computing, IoT, edge computing, quantum communication, neuromorphic computing and ethical concerns will shape the future of computing, enabling us to solve difficult problems and open up new opportunities for advancement.


World Wide Web Inventor Tim Berners-Lee Says Crypto Is ‘Really Dangerous’ but Can Be Useful for Remittances

World Wide Web inventor Sir Tim Berners-Lee says cryptocurrency is “really dangerous” and “only speculative.” While claiming that crypto is for those who “want to have a kick out of gambling,” he noted that it could be useful for remittances. Sir Tim Berners-Lee, the British computer scientist who is widely […]


World Wide Web Inventor Tim Berners-Lee Sells NFT for $5.4M — ‘Embarrassing’ Coding Error Spotted in NFT

Sir Timothy John Berners-Lee, the English computer scientist well known for inventing the World Wide Web, has sold a nonfungible token (NFT) in an online auction hosted by the auction house Sotheby’s. The NFT is basically a video of the World Wide Web’s source code created in Python, and the collectible sold for […]


Bargain? World Wide Web’s source code NFT sells for $5.4M at Sotheby’s

The source code for the World Wide Web has been sold by the inventor for $5.4 million.

The inventor of the World Wide Web, Sir Tim Berners-Lee, has sold an NFT of the web's source code for $5.4 million at fine art auction house Sotheby’s.

The piece, titled “This Changed Everything,” includes a time-stamped file of the source code’s 9,555 lines, a high-fidelity image of the code, a 30-minute animation of the code being written, and a letter written by Berners-Lee.

While $5.4 million is a significant sum, it’s a far cry from the $69 million record set in March for Beeple’s Everydays and less than some observers had predicted. Delphi Digital’s Piers Kicks said he had been expecting a sale of around $10 million.

According to Berners-Lee, he and his wife will donate the proceeds of the auction to causes supported by the family.

In a statement released by Sotheby’s, Berners-Lee discussed the future of the internet and expressed his hope that it would remain open, so that it can be a continual source of creativity, technical innovation and social transformation. These ideals, he said, were the inspiration behind his move into the NFT space:

“NFTs, be they artworks or digital artifacts like this, are the latest playful creations in this realm, and the most appropriate means of ownership that exists. They are the ideal way to package the origins behind the web.”

Related: Hype is over: How NFTs and art will benefit from each other moving forward

The high-profile auction of the web’s source code isn’t the only multimillion-dollar sale of digital artwork hosted by a premier auction house this week. On June 30, Christie’s closed a $2.1 million auction for the works of transgender digital artist FEWOCiOUS.

The NFT “Hello, i’m Victor (FEWOCiOUS) and This Is My Life” includes five individual pieces, each depicting a year of the artist’s formative years from ages 14 to 18 as he transitioned to male. The works illustrate the artist’s struggles with loneliness and identity as he strove to become an artist.

The trajectory of the young artist’s career is indicative of the upward mobility the digital market offers to digital artists. FEWOCiOUS made his first tokenized sale on the marketplace SuperRare for $6,000 in September of last year and was soon selling artworks valued at over $1 million on Nifty Gateway.

Data provided by Cryptoart.io shows that combined sales from the digital art markets have shrunk to $18.3 million in June from their peak of $205 million in March. Winklevoss-owned marketplace Nifty Gateway saw a 94% decline in sales from a peak of $145 million to $7.6 million.

Some digital asset platforms have prospered. Hic et nunc, a relative newcomer to the digital art space, has seen sales grow 276% from $717,000 in March to $2.7 million in June, capturing 14.7% of the market.

Source: cryptoart.io/data
