How we test at Cybernews
Cybernews has been testing and reviewing hundreds of digital products since 2019 to help you make informed buying decisions and stay safe online. Our expert research team tests the products we review in-house – from VPNs and antivirus software to ad blockers and broadband.
The Cybernews international team brings together experienced security researchers, product testers, and writers with specialist knowledge. We all work together to keep our reviews informative, objective, comprehensive, and dedicated to helping you, the user.
When writing reviews, we do not rely on in-house testing results alone. To provide the most well-rounded and objective information, we also take independent test lab results and user reviews into account, drawing on our expertise and years of experience to ensure our reviews are relevant, informative, and reliable.
What we promise
Cybernews’ mission is to provide you with product reviews that reflect the quality of a product, shed light on its strengths and weaknesses, and offer actionable advice.
We seek to offer you relevant, up-to-date, and research-backed information on product performance, ease of use, use case scenarios, and areas for improvement. If we think a product is not up to standard, we will tell you why.
In addition to our expert teams of researchers and writers, we also employ freelance specialists and take real user reviews into account to provide you with as objective and in-depth an analysis of a product as possible.
Why should you trust our reviews?
We do not make baseless claims: all our reviews are empirical and evidence-based, and follow clear testing methodologies while keeping our audience’s needs in mind. Here’s why you can trust Cybernews:
- Expertise. The Cybernews research and writing teams are experts in their fields, with vast experience in cybersecurity, software testing, and information technology. Our backgrounds in academia, the cybersecurity product industry, and hands-on research provide us with the expertise needed for reliable evaluations.
- Transparency and independence. We conduct product testing and evaluation in-house, free from influence or bias from external parties. Our evaluations are based on objective criteria and empirical evidence rather than subjective opinions or personal preferences. Additionally, we clearly outline the criteria, methodology, and sources used in our testing and evaluations.
- Credibility of testing methodology. To ensure the credibility of our testing and evaluation, we follow strict, evidence-based testing protocols and methods designed to provide a comprehensive assessment of each product. We also source information from reputable sources such as industry experts, official product sources, independent testing laboratories, and user feedback. Finally, we do not use generative AI for product testing.
- Consistent review process. To ensure consistency of information and review process, we apply the same standards and criteria to all products in a given category. Our reviews provide consistent and reliable information and enable you to make informed and confident comparisons and decisions over time.
- Attention to detail. We do not cut corners in our review process. We thoroughly review and evaluate everything that pertains to the security, privacy, performance, features, usability, and pricing of each product.
- User-centric approach. We prioritize our audience’s needs and feedback to deliver relevant and actionable reviews. User experiences are an integral part of our evaluations – we seek to align our recommendations with real-world, real-user requirements.
- Up-to-date insights. We keep up with the latest developments in the cybersecurity landscape and ensure our reviews reflect current trends and advancements in cybersecurity software. By providing up-to-date insights, we help you make decisions based on the latest available information and stay ahead of emerging challenges, such as recent breaches and changes in privacy policies or product ownership.
How do we score the products?
All products tested on Cybernews receive a rating from 1 to 5 stars – the higher the rating, the better the product performed against our testing criteria. Here’s what each score means:
| Rating | What it means |
| --- | --- |
| 5 stars ★★★★★ | An excellent product, with minor flaws that don’t hinder performance. The star performer in its category. We highly recommend buying it. |
| 4 stars ★★★★ | A good product, with a few flaws, but secure and with reliable enough performance. |
| 3 stars ★★★ | An acceptable product with more noticeable flaws that may impact security or performance. Buy at your own risk. |
| 2 stars ★★ | A bad product. The drawbacks seriously outweigh the benefits – it is not worth your money. Avoid it and choose something else. |
| 1 star ★ | Has major flaws in security and performance, possibly unusable. We can’t recommend it at all. |
Our testing process
We follow the same standardized process for all our product reviews. Here are the main steps we take when evaluating and testing the services we review:
- Software testing. We test the most important software features (real-time protection, on-demand scans, and firewall for antivirus; leak protection, encryption, and speeds for VPNs; and so on) and evaluate their performance and capabilities. We also test additional features and the value they bring to the product, especially relative to its pricing.
- Security. We investigate the product's security systems, namely encryption ciphers, tunneling protocols, authentication methods, real-time threat detection and prevention effectiveness, and more.
- Privacy documentation and provider history. We examine the provider's privacy policies and how it handles users' data privacy and safety. We also look into the provider's history, reputation, past failures, audits, and awards, using privacy policies, terms of service, audit reports, bug bounty programs, independent investigations, breach reports, and similar documents as sources.
- User reviews. We consider user reviews on various platforms and check for reported software issues and confirmation of working functionality. If available, we check the product’s bug fix logs and history to see how quickly the provider reacts to and fixes flaws in its software.
- User experience and use cases. Finally, we assess our overall experience using the service. We also evaluate the product's apps from different user perspectives and use cases – whether it is better suited for beginners, more experienced users, or specific scenarios, such as particular devices and purposes.
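As one illustration of the kind of metric a speed test in the steps above can produce, the sketch below computes a VPN's speed retention – the share of baseline throughput kept while the tunnel is active. The function name and the example figures are illustrative assumptions, not part of our published methodology.

```python
# Illustrative sketch (not our actual test harness): turning two throughput
# measurements into a speed-retention percentage, a common way to compare
# VPN performance against a no-VPN baseline.

def speed_retention(baseline_mbps: float, vpn_mbps: float) -> float:
    """Return the percentage of baseline throughput retained with the VPN on."""
    if baseline_mbps <= 0:
        raise ValueError("baseline throughput must be positive")
    return round(vpn_mbps / baseline_mbps * 100, 1)

# Example: a 300 Mbps line measuring 255 Mbps through the tunnel
# retains 85% of its baseline speed.
print(speed_retention(300, 255))  # 85.0
```

Reporting retention rather than raw Mbps makes results comparable across reviewers with different connection speeds.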
Based on the nature of the review, we test the products on different devices. Here’s what we use:
- Windows – a VirtualBox VM running Windows 11 (10 GB RAM, 120 GB dynamic HDD, and 2 CPU cores allocated), installed on a Lenovo ThinkPad T14s Gen 2 (AMD Ryzen 5 PRO 5650U processor, 16 GB RAM, 256 GB SSD, Windows 11 Pro).
- macOS – MacBook Air 7.2 (1.8 GHz dual-core Intel Core i5, 8 GB RAM, 250 GB storage, of which 81 GB is free). The VirtualBox VM: 4 processor cores at 3.81 GHz, 16 GB RAM, and 215 GB storage, of which 182 GB is free.
- Android – Nokia 6.2 (model TA-1198), Android version 11, 64GB internal storage, 4GB RAM, CPU Octa-Core (4x1.8 GHz Kryo 260 Gold & 4x1.6 GHz Kryo 260 Silver).
- iOS – iPhone 7 (model MN8X2), iOS version 15.7.1., 32GB internal storage, 2GB RAM, CPU Quad-core 2.34 GHz (2x Hurricane + 2x Zephyr).
At Cybernews, we have an in-house testing team of security researchers, writers, and leading technology experts who review and evaluate every product we cover. Our security experts and product reviewers work hand in hand, conducting real-life tests and extensive security research to provide an in-depth analysis of each provider.
The research team conducts regular tests to ensure all product changes and differences are accounted for in our articles.
We continuously update the information about the products we review to keep our information consistent and maintain high standards. If you’ve spotted an inconsistency in our reviews, contact us here.