We would like your opinion on which Open Technologies you are familiar with and which you would like to know more about: a poll comparing popular technology stacks with some you might want to discover, followed by the ones we think deserve your attention.
Cloud native is a way to build and run applications that takes full advantage of cloud computing technologies. It involves building an application as a collection of microservices that are deployed independently and scaled horizontally to meet demand. This provides more flexibility, as developers can modify individual services as needed rather than replacing the entire application. It also provides enhanced scalability, because teams can easily spin up or remove containers in response to traffic needs, reducing costs. Also, applications are deployed across multiple servers or nodes, meaning that a failure in one part may not affect the entire system. An article from The New Stack describes the subject in more depth, identifying four pillars of the cloud native concept: microservices, DevOps, open standards, and containers.
Microservices are important for cloud-based architectures as they offer many advantages, such as scalability, durability, and agility. Microservices are smaller and more focused than monolithic applications, which makes them easier to build, test, and deploy. This allows teams to move faster and respond more quickly to changing business and application requirements.
DevOps is another important aspect of cloud-native architecture, as it enables teams to deploy and update applications quickly through agile processes.
Open-source frameworks (like Kubernetes) provide a way for building cloud-native applications and enable interoperability between different products.
Containers and their orchestration provide a lightweight and portable way to package and run applications, allowing them to be easily moved between different locations.
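As a minimal illustration of the microservice idea described above, the sketch below uses only Python's standard library to expose a single health-check endpoint (the service shape and endpoint path are our own illustrative choices, not from any specific framework). This is the kind of small, single-purpose, independently deployable unit that container orchestrators package and scale horizontally:

```python
# Minimal single-purpose "microservice" sketch: one process, one
# responsibility, packaged and scaled independently of the rest of
# the application. Endpoint and service shape are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        # Keep the example quiet; a real service would log structured events.
        pass

def make_server(port: int = 0) -> HTTPServer:
    # Port 0 lets the OS choose a free port; an orchestrator would
    # normally map a fixed container port instead.
    return HTTPServer(("127.0.0.1", port), HealthHandler)

# Usage: srv = make_server(8080); srv.serve_forever()
```

In a containerized deployment, each such service gets its own image and its own replica count, which is what makes independent scaling possible.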
Held in Amsterdam, the Netherlands, in April, KubeCon + CloudNativeCon Europe 2023 was one of the largest open-source conferences ever, with over 12,000 attendees. Together with valuable insights, the event confirmed once again that Kubernetes is the centre of cloud-native computing. An interesting article on Silverlinings by Steven Vaughan-Nichols describes five highlights from the show, which we summarise here:
Demand for Experts: Cloud-native professionals are in high demand; every company in the market is hiring Kubernetes experts, from start-ups to Fortune 50 companies like IBM, Microsoft, and AWS. SUSE and KubeCampus.io also launched new Kubernetes training programs to fill the shortage of cloud-native programmers, DevOps and security professionals, and administrators.
K8s 1.27: Released before KubeCon, Kubernetes 1.27 includes a few changes; two of them matter most for anyone serious about using Kubernetes: a new community image registry, registry.k8s.io, replaces the old name k8s.gcr.io, and SeccompDefault is now stable, a feature that makes Kubernetes containers more secure by restricting, by default, the system calls their processes are allowed to make.
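To make both changes concrete, the sketch below builds a Pod manifest as a plain Python dict for illustration (it would normally be YAML; the pod name and image tag are hypothetical examples): the image is pulled from the new registry.k8s.io, and the `RuntimeDefault` seccomp profile — the one SeccompDefault now applies cluster-wide when enabled on the kubelet — is set explicitly:

```python
# Illustrative Pod manifest as a Python dict (would normally be YAML).
# The pod name and image tag are hypothetical examples.
def make_pod_manifest(name: str) -> dict:
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": name},
        "spec": {
            "securityContext": {
                # RuntimeDefault restricts the system calls available to
                # the container's processes; SeccompDefault applies this
                # profile by default when enabled on the kubelet.
                "seccompProfile": {"type": "RuntimeDefault"}
            },
            "containers": [
                # Image pulled from the new community registry,
                # registry.k8s.io, rather than the old k8s.gcr.io.
                {"name": "app", "image": "registry.k8s.io/pause:3.9"}
            ],
        },
    }
```

Serialized to YAML, this is the kind of spec you would apply with `kubectl apply -f`.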
Automation for Security: The use of automation in software supply chain security is gaining popularity due to the lack of qualified security personnel. Companies use software bill of materials (SBOM) and Supply-chain Levels for Software Artifacts (SLSA) tools to protect software products by building security into continuous integration and deployment (CI/CD) pipelines.
SLSA v1.0: Open Source Security Foundation (OpenSSF) president Brian Behlendorf said that the stable release of SLSA v1.0 is an important step in improving the security of software products. It gives developers the tools necessary to evaluate upstream dependencies, measure the reliability of legacy products, and gauge future compliance efforts against the Secure Software Development Framework (SSDF).
Generative AI & Cloud: Generative AI and the cloud were also a highlight at KubeCon, where the convergence of generative-AI programs like ChatGPT with cloud-native computing was discussed. Some companies, such as Microsoft, are already exploring the use of generative AI to optimise container workloads and reduce costs by predicting the optimal resources needed to run applications.
With a post on the Azure blog, Microsoft announced that ChatGPT is now available in preview in Azure OpenAI Service. GPT-3.5 joins the set of AI models already available (DALL-E 2, Codex, etc.) to help customers innovate in new ways and in different scenarios with the power of artificial intelligence, such as generating suggested email copy or helping with software programming questions. Developers can now integrate custom AI-powered experiences directly into their applications.
The Azure OpenAI Service has become a popular choice among customers to apply advanced AI models, and introducing ChatGPT will probably increase this popularity. The post also highlights how organisations are already utilising Azure OpenAI and ChatGPT to achieve business value. Microsoft is also leveraging AI models in Azure OpenAI Service to introduce new experiences in its products, such as GitHub Copilot and Microsoft Teams Premium.
Microsoft also discusses how AI, specifically generative models like ChatGPT and DALL-E, can have a positive impact on productivity and creativity but also pose challenges in terms of trust and responsible use. Microsoft has implemented a set of mitigations at multiple levels, including customer-controlled application protections, technical input/output filtering, process and policy protections, and transparent documentation. Their commitment is to a principled approach to AI development to ensure that AI systems are used responsibly for the benefit of humanity.
AI solutions will become more and more popular and their adoption in the cloud will become a natural step forward. OpenCloudification will include this and other relevant topics during the development of the project.
For the past eight years, the Cloud Native Computing Foundation has utilised its pivotal role in the cloud-native community to survey the technology landscape, comprehend its workings, and better cater to the users of open-source cloud-native technologies. The worldwide survey was conducted from June to September 2022, reaching 2,063 participants. It was conducted in English, Chinese, and Japanese and was promoted through various channels such as social media and email newsletters. The questions in the survey were more nuanced this year, differentiating between production usage and selected usage, and also included anonymous production data from Buoyant, Datadog, and Dynatrace to provide real-world insight into CNCF project usage. The results of the survey provide a comprehensive understanding of the cloud native community, their challenges, and the benefits they see from using cloud native technologies. The report is global in scope, covering six continents and including organisations of all sizes.
It is clear that Kubernetes has established itself as a widely adopted and powerful platform for organizations, and its versatility and ability to accommodate a range of workloads has solidified its position as the “operating system of the cloud”. 44% of respondents are already using containers for most applications and business segments, and another 35% are using containers for a few production applications. However, the adoption of cloud native techniques is still in its early stages, as only 30% of respondents’ organizations have adopted it across nearly all development and deployment activities. Despite this, 62% of organizations that do not regularly use cloud native techniques have containers for pilot projects or limited production use cases, indicating potential for growth. According to the 2022 Container Report by Datadog (Datadog, 9 Insights on Real-World Container Use), nearly half of all organizations using containers run Kubernetes to deploy and manage some of those containers. Additionally, organizations are more likely to adopt a multi-cloud approach as they grow in size.
In general, the adoption of Cloud Native Computing Foundation (CNCF) incubated and graduated projects continued to grow. OpenTelemetry, a project for collecting and analyzing telemetry data from applications and infrastructure, saw a significant increase in usage, rising from 4% in 2020 to 20% in 2022. Argo, a GitOps-based continuous delivery and multi-cloud serverless platform, also saw a significant increase in usage, growing from 10% to 28%. Containerd, a popular open-source container runtime, and CoreDNS, a DNS server with a focus on service discovery, both saw significant increases in use and evaluation: containerd usage increased from 36% to 56%, while CoreDNS usage increased from 48% to 56%. Serverless architecture/FaaS also moved from 30% to 53%, and 37% of end-user organizations reported some experience deploying applications with WebAssembly, with WasmEdge and WAMR being the most commonly used runtimes. The use of service meshes increased from 36% in 2021 to 45% in 2022, demonstrating that organizations are looking to further automate and optimize communication between microservices. The use of cloud native observability tools, such as Prometheus and Grafana, also rose significantly in 2022, with 57% of organizations now using them to monitor and troubleshoot their Kubernetes deployments.
These trends indicate that organizations are increasingly looking to adopt and integrate advanced technologies into their cloud-native ecosystems. CNCF projects are widely recognized as best-of-breed, production-ready solutions that help organizations address the challenges of deploying, managing and scaling cloud-native applications. The widespread adoption of open-source projects also suggests that organizations are looking for cost-effective and flexible solutions that can be customized to meet their unique needs. Lack of training was reported as the biggest challenge in using and deploying containers. This challenge was cited by 44% of organizations who have not yet deployed containers in production and by 41% of organizations who use containers on a limited basis. As organizations increase their use of containers for nearly all applications, security becomes the top concern. These findings suggest that organizations need to invest in training and security measures in order to fully adopt and benefit from container technology.
OpenCloudification hosted a best-practice cases webinar on the use of cloud and open cloud technologies for technology and manufacturing companies, part of the OpenCloudification.com series supporting ICT and manufacturing companies in embracing cloud and open cloud technologies and in digitally transforming. Topics covered included: why a technology or manufacturing company should consider cloud and open cloud, technology and manufacturing use cases explained, the basic business cases for technology and manufacturing companies, making your own, some first steps, what to prepare for cloudification and digitization, …
The “Cloud and Threat Report” for 2022 from Netskope Threat Labs highlights an increase in malware delivered from cloud environments, with Microsoft OneDrive leading the charts as the origin of the majority of cloud malware downloads, alongside phishing, scams, credit card skimmers, exploit kits, and other malicious web content, emphasizing the importance of inspecting all content from all destinations, for both web and cloud. Netskope detected malware downloads from 401 distinct cloud apps in 2022. The report states that the percentage of malware downloads from the cloud increased, ending the year ten points higher than in 2021. In the last year, 48% of HTTP/HTTPS malware downloads originated from cloud apps, and 30% of all cloud malware downloads originated from Microsoft OneDrive, a reflection of attacker tactics, user behaviour, and company policy. By industry vertical, the largest increases in cloud malware downloads occurred in healthcare, manufacturing, and telecom. Per specific sector, Google Drive takes the top spot in retail and Azure Blob Storage leads in healthcare. The majority of malicious web content is hosted on a variety of different types of sites; the top ten categories, which include uncategorized sites and marketing sites, account for only 13.6% of the total malicious web content accessed. Attackers have been populating their websites with enough content to make them seem legitimate, using them to host malicious content only after they have been around long enough to blend in. They have also been abusing free hosting services and compromising existing websites to deliver malicious content.
Keep Your Cloud Protected
The report suggests several measures for organizations to protect themselves against cloud-delivered malware and malicious web content. These include:
Deploy multi-layered, inline threat protection for all cloud and web traffic to block inbound malware and outbound malware communications.
Enforce granular policy controls to limit data flow, including flow to and from apps, between the company and personal instances, among users, and to and from the web, adapting the policies based on device, location, and risk.
Deploy cloud data protection to limit the movement of sensitive data, including preventing its movement to unauthorized devices, apps, and instances.
Provide real-time coaching that guides users toward safer app alternatives to protect data, asks them to justify unusual data activity, and applies step-up authentication for risky conditions within business transactions.
Reduce browsing risk for newly registered domains, newly observed domains, uncategorized websites, and other security risk categories by using remote browser isolation (RBI).
Mitigate the risk of stolen credentials by enabling multi-factor authentication (MFA) and extending MFA to unmanaged apps via an identity service provider or Security Service Edge (SSE) platform.
Use behavioural analytics to detect compromised accounts, compromised devices, and insider threats.
Enable zero trust principles for least privilege access to data with continuous monitoring and reporting to uncover unknown risks using a closed loop to then further refine access policies.
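The "granular policy controls" and "least privilege" recommendations above boil down to evaluating every single request against context before granting access. The sketch below illustrates the idea (the field names, risk scale, and threshold are our own illustrative assumptions, not part of any Netskope product):

```python
# Illustrative per-request policy check: every access is evaluated
# against device, authentication, and risk context -- nothing is
# trusted by default. Fields and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_managed: bool   # is the request coming from a managed device?
    mfa_passed: bool       # has the user completed MFA for this session?
    risk_score: float      # 0.0 (safe) .. 1.0 (risky), assumed scale

def allow(req: AccessRequest, risk_threshold: float = 0.7) -> bool:
    # Deny by default; grant only when every check passes.
    return (req.device_managed
            and req.mfa_passed
            and req.risk_score < risk_threshold)
```

A real policy engine would also factor in location, app instance (corporate vs personal), and data sensitivity, and would feed its decisions back into continuous monitoring as the last bullet describes.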
Cloud repatriation refers to the process of moving data and applications back to traditional, on-premises systems from a public cloud provider. This trend is usually driven by the high costs and complexity of cloud computing, as many enterprises have found that their cloud bills are higher than expected. This is often due to inefficiencies in their migrated applications and data. While refactoring these workloads to be more cost-efficient on the cloud can be time-consuming and expensive, repatriation may be a more cost-effective option in some cases, particularly for workloads that do not require advanced cloud-based services or can take advantage of the lower prices of traditional data center hardware. However, when deciding whether to repatriate workloads to a traditional data center or not, it is important to consider some variables, like dependencies on specialized cloud-based services or the potential cost savings from native cloud capabilities.
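One of the variables in that decision — raw running cost — can be approximated with a back-of-the-envelope comparison of monthly cloud spend against amortized on-premises hardware plus operating expenses. The sketch below is purely illustrative (the linear amortization model and all figures are assumptions; it deliberately ignores dependencies on specialized cloud services, migration effort, and elasticity):

```python
# Back-of-the-envelope monthly cost comparison: keep a workload in
# the cloud, or repatriate it to owned hardware amortized over its
# lifetime? All modelling choices here are illustrative assumptions.
def on_prem_monthly_cost(hardware_cost: float,
                         lifetime_months: int,
                         monthly_opex: float) -> float:
    # Hardware amortized linearly, plus power/cooling/staff opex.
    return hardware_cost / lifetime_months + monthly_opex

def cheaper_to_repatriate(cloud_monthly: float,
                          hardware_cost: float,
                          lifetime_months: int,
                          monthly_opex: float) -> bool:
    on_prem = on_prem_monthly_cost(hardware_cost, lifetime_months, monthly_opex)
    return on_prem < cloud_monthly
```

For example, a 36,000-euro server amortized over 36 months with 500 euros of monthly opex costs 1,500 euros per month, which beats a 2,000-euro cloud bill but not a 1,200-euro one — which is exactly why steady, predictable workloads are the usual repatriation candidates.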
Some tips can help you avoid cloud repatriation and thereby save money and time. OpenCloudification can help, whether you are looking for expertise or simply want to double-check that you are following these steps correctly:
Plan and prepare the migration process: Take the time to thoroughly assess your current workloads and determine which ones are suitable for the cloud. Consider factors such as the workload’s performance and scalability requirements, as well as any dependencies on cloud services.
Refactor your applications and data for the cloud: This can involve redesigning your applications to take advantage of native cloud capabilities such as auto-scaling and security, as well as optimizing your data storage for the cloud.
Monitor and manage your cloud costs: Keep track of your cloud usage and expenses, and take steps to optimize your cost efficiency. This can involve turning off idle resources or using cost-effective storage options, for instance.
Review your cloud contract: Make sure you understand it and ensure that it aligns with your business needs.
Use a cloud management platform: It can help you automate and optimize your cloud usage, as well as provide visibility into your cloud costs and performance.
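The cost-monitoring step above can be sketched as a simple idle-resource report: flag resources whose recent utilisation suggests they could be shut down or downsized (the record fields, utilisation scale, and idle threshold are illustrative assumptions, not any vendor's API):

```python
# Flag cloud resources whose recent CPU utilisation suggests they are
# idle and could be shut down or downsized to cut the monthly bill.
# Record fields and the 5% idle threshold are illustrative assumptions.
def find_idle(resources: list[dict], cpu_threshold: float = 0.05) -> list[str]:
    # Each record: {"name": str, "avg_cpu": float (0..1), "monthly_cost": float}
    return [r["name"] for r in resources if r["avg_cpu"] < cpu_threshold]

def potential_savings(resources: list[dict], cpu_threshold: float = 0.05) -> float:
    # Upper bound on savings if every idle resource were switched off.
    return sum(r["monthly_cost"] for r in resources
               if r["avg_cpu"] < cpu_threshold)
```

In practice a cloud management platform would pull these utilisation figures from the provider's metrics service and apply the same kind of thresholding automatically.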
Cloudification offers many great opportunities and is the way to go, but it requires some cybersecurity considerations. Learn from some of the mistakes and incidents of previous years: 1. failing to implement “any” security measures; 2. misconfiguration of Azure buckets; 3. a management dashboard publicly available over the internet; 4. compromised employee accounts (used to access a development environment); 5. breaching cloud-based data networks; 6. a misconfigured AWS S3 bucket; 7. an unsecured Elasticsearch database with operational data; 8. a private cloud infected with ransomware; 9. a deployment error for AWS Analytics (by Amazon); 10. no password protection (and a misconfigured AWS S3 bucket); … For more information, read the full article: https://www.immuniweb.com/blog/top-10-cloud-security-incidents-in-2022.html
According to the report “The State of Zero Trust Transformation 2023” by cloud security firm Zscaler, more than 90% of IT leaders who have started migrating to the cloud have implemented or are planning to implement a zero-trust security architecture. This approach, which is based on the principle that no user, device, or application should be inherently trusted, is seen as the ideal framework for securing enterprise users, workloads, and IoT/OT environments in a highly distributed and mobile-centric world.
Implementing a zero-trust model can bring advantages, like reducing the risk of data breaches and other security incidents by requiring all access to be carefully verified and controlled. With a zero-trust model, organizations can grant access to resources and systems on an as-needed basis, rather than relying on a perimeter-based approach. This can be especially useful for organizations with distributed workforces or that need to grant access to external partners or vendors. In addition, a zero-trust model can help organizations to meet various compliance requirements, such as those related to data protection and privacy.
At the same time, implementing a zero-trust model can be complex and may require significant investment in security infrastructure and resources. The added security measures may lead to longer login times and other delays, which can impact productivity, and users may find them inconvenient, particularly if they need to authenticate multiple times or use different methods to access different resources.
The report also found that IT leaders see security, access, and complexity as the top concerns in the cloud and that traditional VPNs and perimeter-based firewalls are ineffective at protecting against cyberattacks or providing visibility into application traffic and attacks.
It’s worth noting that the specific pros and cons of implementing this model will depend on an organization’s specific needs and priorities. Organizations should carefully consider their security requirements and resources when deciding whether to implement a zero-trust model.