ISC2 CCSP Exam Questions

Page 8 of 50

141.

Identifying ownership, limitations on distribution, and similar information is part of which of the following?

  • Data labeling

  • Data dispersion

  • Data flow diagram

  • Data mapping

Correct answer: Data labeling

Data dispersion is when data is distributed across multiple locations to improve resiliency. Overlapping coverage makes it possible to reconstruct data if a portion of it is lost.

A data flow diagram (DFD) maps how data flows between an organization’s various locations and applications. This helps to maintain data visibility and implement effective access controls and regulatory compliance.

Data mapping identifies data requiring protection within an organization. This helps to ensure that the data is properly protected wherever it is used.

Data labels contain metadata describing important features of the data. For example, data labels could include information about ownership, classification, limitations on use or distribution, and when the data was created and should be disposed of.
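As a rough illustration of the idea, a data label can be thought of as metadata attached to a record. The field names below are invented for this sketch and are not from any particular standard:

```python
# A minimal sketch of data-label metadata attached to a record.
# Field names here are illustrative, not from any specific standard.
data_label = {
    "owner": "finance-dept",
    "classification": "Confidential",
    "distribution": "internal-only",   # limitation on distribution
    "created": "2024-01-15",
    "dispose_after": "2031-01-15",     # retention/disposal date
}

def may_distribute_externally(label: dict) -> bool:
    """Check a label's distribution limitation before sharing data."""
    return label.get("distribution") == "public"

print(may_distribute_externally(data_label))  # False for this label
```

Automated controls (such as DLP tools) can then read these labels to enforce the stated limitations.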

142.

Licensing is a concern that is MOST related to which of the following?

  • Third-Party Software

  • API Security

  • Supply Chain Security

  • Open Source Software

Correct answer: Third-Party Software

Some important considerations for secure software development in the cloud include:

  • API Security: In the cloud, the use of microservices and APIs is common. API security best practices include identifying all APIs, performing regular vulnerability scanning, and implementing access controls to manage access to the APIs.
  • Supply Chain Security: An attacker may be able to access an organization’s systems via access granted to a partner or vendor, or a failure of a provider’s systems may place an organization’s security at risk. Companies should assess their vendors’ security and ability to provide services via SOC 2 reports and ISO 27001 certifications.
  • Third-Party Software: Third-party software may contain vulnerabilities or malicious functionality introduced by an attacker. Also, the use of third-party software is often managed via licensing, with whose terms an organization must comply. Visibility into the use of third-party software is essential for security and legal compliance.
  • Open Source Software: Most software uses third-party and open-source libraries and components, which can include malicious functionality or vulnerabilities. Developers should use software composition analysis (SCA) tools to build a software bill of materials (SBOM) to identify any potential vulnerabilities in components used by their applications.
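The SBOM check described above can be sketched in a few lines. This is a hedged illustration: the SBOM below loosely follows the CycloneDX JSON shape, and the advisory data is entirely made up for the example:

```python
import json

# Illustrative SBOM in a CycloneDX-style JSON shape (simplified).
sbom_json = '''
{
  "components": [
    {"name": "left-pad", "version": "1.3.0"},
    {"name": "log4j-core", "version": "2.14.1"}
  ]
}
'''

# Hypothetical advisory data: component name -> known-vulnerable versions.
advisories = {"log4j-core": {"2.14.1", "2.15.0"}}

def flag_vulnerable(sbom: str, advisories: dict) -> list:
    """Return (name, version) pairs in the SBOM with known advisories."""
    components = json.loads(sbom)["components"]
    return [(c["name"], c["version"])
            for c in components
            if c["version"] in advisories.get(c["name"], set())]

print(flag_vulnerable(sbom_json, advisories))  # [('log4j-core', '2.14.1')]
```

Real SCA tools perform this matching at scale against curated vulnerability databases rather than a hand-built dictionary.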

143.

Generally Accepted Privacy Principles (GAPP) is a standard consisting of many privacy principles, one of which governs how an organization may use the information it collects. What does the use principle say?

  • The organization can utilize information for only the purpose for which it was collected and only for a limited amount of time. At the end of that time, it must be disposed of appropriately and permanently.

  • The organization can utilize information for the purpose for which it was collected and within expected use beyond that. They are allowed to store the information in archival status for up to 50 years.

  • The organization can utilize the information for anything except offering it for sale. They must notify the customer of its imminent deletion, so the customer can opt back in if they choose.

  • The organization can utilize the information for the original stated purpose and for the following seven years. At the end of that time, it must be disposed of appropriately and permanently.

Correct answer: The organization can utilize information for only the purpose for which it was collected and only for a limited amount of time. At the end of that time, it must be disposed of appropriately and permanently.

The Generally Accepted Privacy Principles (GAPP) is a standard that consists of 10 key principles. The use, retention, and disposal principle states that use of personal information is limited to the purposes for which it was collected in the notice the individual consented to. The organization can then retain that information only for as long as it is needed to fulfill that purpose. At the end of that time, it must be disposed of appropriately and permanently.

144.

Nicole is looking into different public cloud providers for her business. The corporation is looking for the best Platform as a Service (PaaS) vendor to begin moving its servers and applications to. Nicole and her team have been looking at the different audits and certifications that each provider has pursued successfully.

What standard would they look for if they are interested in the cloud provider demonstrating compliance with international recommendations for information security?

  • ISO/IEC 27017 (International Organization for Standardization / International Electrotechnical Commission)

  • ISO/IEC 27018 (International Organization for Standardization / International Electrotechnical Commission)

  • AICPA SOC 2 Type II (American Institute of Certified Public Accountants, System and Organization Controls)

  • NIST SP 800-37 (National Institute of Standards and Technology Special Publication)

Correct answer: ISO/IEC 27017 (International Organization for Standardization / International Electrotechnical Commission)

ISO/IEC 27017 is an international standard that provides guidelines and best practices for information security controls specific to cloud service providers (CSPs) and cloud-based services. It focuses on addressing the unique security risks and challenges associated with cloud computing environments.

ISO/IEC 27018 is an international standard that provides guidelines and best practices for protecting personally identifiable information (PII) in public cloud environments. It specifically addresses the privacy concerns and requirements related to the processing of PII by cloud service providers (CSPs).

AICPA SOC 2 Type II is a framework for evaluating and reporting on the controls and processes of service organizations to ensure the security, availability, processing integrity, confidentiality, and privacy of their systems and data. It is an internationally recognized standard developed by the American Institute of Certified Public Accountants (AICPA) and is commonly used to assess the trustworthiness of service providers.

NIST SP 800-37, titled "Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach," is a publication by the National Institute of Standards and Technology (NIST) that provides guidance on implementing a risk management framework for federal information systems.

145.

Which of the following cloud data encryption solutions is MOST likely to use a customer-controlled key?

  • Volume-level encryption

  • Storage-level encryption

  • Object-level encryption

  • Virtual HSM

Correct answer: Volume-level encryption

Data can be encrypted in the cloud in a few different ways. The main encryption options available in the cloud are:

  • Storage-Level Encryption: Data is encrypted as it is written to storage using keys known to/controlled by the CSP.
  • Volume-Level Encryption: Data is encrypted when it is written to a volume connected to a VM using keys controlled by the cloud customer.
  • Object-Level Encryption: Data written to object storage is encrypted using keys that are most likely controlled by the CSP.
  • File-Level Encryption: Applications like Microsoft Word and Adobe Acrobat can encrypt files using a user-provided password or a key controlled by an IRM solution.
  • Application-Level Encryption: An application encrypts its data using keys provided to it before storing the data (typically in object storage). Keys may be provided by the customer or CSP.
  • Database-Level Encryption: Databases can be encrypted at the file level or use transparent encryption, which is built into the database software and encrypts specific tables, rows, or columns. These keys are usually controlled by the cloud customer.

Hardware security modules (HSMs) are a type of hardware used for secure management of keys and sensitive data. Cloud service providers offer virtual HSMs for cloud environments. 

146.

Adriaan works for a large pharmaceutical company as their information security officer. He is working with the cloud data architect to plan the move of a critical server and its associated data. The data will be stored as big data at one of the large public cloud providers. They are currently working on moving the data to the cloud. Since the dataset is roughly one petabyte, transferring it to the cloud servers takes a great deal of effort. They have chosen to use the physical device transfer option.

How should they protect the data in transit from the on-premises data center to the cloud data center?

  • Encrypt the data at rest with the Advanced Encryption Standard (AES)

  • Encrypt the data in transit with Transport Layer Security (TLS)

  • Encrypt the data in transit with Internet Protocol Security (IPSec)

  • Encrypt the data in use with Fully Homomorphic Encryption (FHE)

Correct answer: Encrypt the data at rest with the Advanced Encryption Standard (AES)

This is a physical drive being transported from the on-premises data center by car/truck/van to the cloud data center. During transport, the data is considered data at rest, so encrypting it with AES is the correct approach.

Data in transit means that it is moving across a network over a wireless, wired, or fiber connection. If that were the case here, either TLS or IPSec would be appropriate.

Data in use means that it is being processed. If the data were in use and an FHE scheme worked for that application, that would be optimal.
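Alongside encrypting the shipped data at rest, it is common to verify the data's integrity once the physical device arrives. A minimal sketch of that verification step using SHA-256 from Python's standard library (the ciphertext bytes below are a stand-in, not real data):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Compute a SHA-256 digest to verify data integrity after transport."""
    return hashlib.sha256(data).hexdigest()

# Before shipping: record a digest of the (already AES-encrypted) data.
encrypted_blob = b"\x8f\x1a-stand-in-ciphertext"  # illustrative placeholder
digest_before = sha256_digest(encrypted_blob)

# After the device arrives at the cloud data center: recompute and compare.
digest_after = sha256_digest(encrypted_blob)
assert digest_before == digest_after, "data corrupted or tampered with in transit"
print("integrity verified")
```

The digest comparison detects corruption or tampering during transport; the AES encryption protects confidentiality if the device is lost or stolen.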

147.

Which of the following refers to a cloud customer's ability to grow or shrink their cloud footprint on demand?

  • Elasticity

  • Scalability

  • Agility

  • Mobility

Correct answer: Elasticity

  • Elasticity refers to a system’s ability to grow and shrink on demand.
  • Scalability refers to its ability to grow as demand increases.
  • Agility and mobility are not terms used to describe cloud environments.

148.

A cloud security engineer working for a financial institution needs to determine how long specific financial records must be stored and preserved. Which of the following specifies how long financial records must be preserved?

  • Sarbanes-Oxley (SOX)

  • General Data Protection Regulation (GDPR)

  • Privacy Act of 1988

  • Gramm-Leach-Bliley Act (GLBA)

Correct answer: Sarbanes-Oxley (SOX)

The Sarbanes-Oxley Act (SOX) regulates how long financial records must be kept. SOX is enforced by the Securities and Exchange Commission (SEC). SOX was passed to protect stakeholders and shareholders from improper practices and errors, in response to the fraudulent financial reporting at Enron.

GDPR is the European Union's (EU) regulation requiring organizations to protect the personal data in their possession; it does not specify retention periods for financial records.

GLBA, the Gramm-Leach-Bliley Act, requires financial institutions to protect the privacy and security of their customers' personal financial information; it does not specify how long financial records must be preserved.

The Privacy Act of 1988 is an Australian law that requires personal data to be protected.

149.

A cloud service provider is building a new data center to provide options for companies that are looking for private cloud services. They are working to determine the size of the data center that they want to build. The Uptime Institute created the Data Center Site Infrastructure Tier Standard: Topology, which defines several tiers of data centers. The cloud provider has a goal of reaching Tier III.

How is that characterized in general?

  • Concurrently Maintainable

  • Fault Tolerance

  • Basic Capacity

  • Redundant Capacity Components 

Correct answer: Concurrently Maintainable

The Uptime Institute publishes one of the most widely used standards on data center tiers and topologies. The standard is based on four tiers, which include:

  • Tier I: Basic Capacity
  • Tier II: Redundant Capacity Components
  • Tier III: Concurrently Maintainable
  • Tier IV: Fault Tolerance

150.

Nur, a systems administrator at Acme Inc., is using a REST API to configure a system. Which HTTP verb should Nur use to replace data at an existing API endpoint?

  • PUT

  • GET

  • UPDATE

  • POST

Correct answer: PUT

Common HTTP verbs include:

  • POST: Creates an object
  • GET: Retrieves an object
  • DELETE: Deletes an object
  • PATCH: Modifies an object
  • PUT: Updates an object or replaces data

UPDATE is not an HTTP verb.
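The semantics of these verbs can be sketched with a simple in-memory resource store. This is an illustrative model of the verb behaviors, not any particular API's implementation:

```python
# Illustrative in-memory store modeling common HTTP verb semantics.
store = {}
next_id = 1

def post(data):            # POST: creates a new object
    global next_id
    obj_id = next_id
    next_id += 1
    store[obj_id] = data
    return obj_id

def get(obj_id):           # GET: retrieves an object
    return store.get(obj_id)

def put(obj_id, data):     # PUT: replaces the object at an existing endpoint
    store[obj_id] = data

def patch(obj_id, fields): # PATCH: modifies part of an object
    store[obj_id].update(fields)

def delete(obj_id):        # DELETE: removes an object
    store.pop(obj_id, None)

obj = post({"name": "web-01", "state": "running"})
put(obj, {"name": "web-01", "state": "stopped"})   # full replacement
print(get(obj))  # {'name': 'web-01', 'state': 'stopped'}
```

Note the PUT call replaces the entire stored object, which is why PUT is the right verb for replacing data at an existing endpoint, while PATCH only merges in the supplied fields.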

151.

Which of the following types of SOC reports could include an extended assessment of the effectiveness of an organization's security controls?

  • SOC 2 Type II

  • SOC 2 Type I

  • SOC 1 Type I

  • SOC 3 Type II

Correct answer: SOC 2 Type II

Service organization control (SOC) reports are audit reports defined by the American Institute of CPAs (AICPA). The three types of SOC reports are:

  • SOC 1: SOC 1 reports focus on financial controls and are used to assess an organization’s financial stability.
  • SOC 2: SOC 2 reports assess an organization's controls in different areas, including security, availability, processing integrity, confidentiality, or privacy. Only the security area is mandatory in a SOC 2 report.
  • SOC 3: SOC 3 reports provide a high-level summary of the controls that are tested in a SOC 2 report but lack the same detail. SOC 3 reports are intended for general dissemination.

SOC 2 reports can also be classified as Type I or Type II. A Type I report is based on an analysis of an organization’s control designs but does not test the controls themselves. A Type II report is more comprehensive, as it tests the effectiveness and sustainability of the controls through a more extended audit. 

Type I and Type II do not apply to SOC 3. 

152.

Which cloud characteristic is discussed when the Central Processing Unit (CPU), memory, network capacity, and other things are allocated to customer virtual machines as they are needed?

  • Resource pooling

  • On-demand self-service

  • Interoperability

  • Broad network access

Correct answer: Resource pooling

When resources such as CPU, memory, and network capacity are gathered into a pool and allocated to running virtual machines as needed, that is resource pooling.

On-demand self-service is when the cloud provider (private, public, or community) provides a web interface where customers can browse offerings and purchase them, such as aws.amazon.com.

Interoperability is the ability of two different systems to share a piece of data and use it, such as a Word document created on a Mac being sent to a Windows device and remaining readable.

Broad network access means that cloud services are accessible over the network through standard mechanisms; without network access, the customer cannot use the cloud.

153.

The government's Unclassified/Confidential/Secret/Top Secret system classifies data based on which of the following?

  • Sensitivity

  • Type

  • Ownership

  • Criticality

Correct answer: Sensitivity

Data owners are responsible for data classification, and data is classified based on organizational policies. Some of the criteria commonly used for data classification include:

  • Type: Specifies the type of data, including whether it has personally identifiable information (PII), intellectual property (IP), or other sensitive data protected by corporate policy or various laws.
  • Sensitivity: Sensitivity refers to the potential results if data is disclosed to an unauthorized party. The Unclassified, Confidential, Secret, and Top Secret labels used by the U.S. government are an example of sensitivity-based classifications.
  • Ownership: Identifies who owns the data if the data is shared across multiple organizations, departments, etc.
  • Jurisdiction: The location where data is collected, processed, or stored may impact which regulations apply to it. For example, GDPR protects the data of EU citizens.
  • Criticality: Criticality refers to how important data is to an organization’s operations.

154.

During which type of testing do legitimate testers perform actions that are similar to the actions a threat actor would take? 

  • Abuse case

  • Attacker simulation

  • SAST

  • Gray-box 

Correct answer: Abuse case

Abuse case testing involves legitimate testers or automated tooling performing actions that are outside of the expected use cases for a system and more aligned with attacker behavior (e.g., malicious or malformed inputs). 

Attacker simulation testing is a distractor answer and is not a standard type of testing. 

Static application security testing (SAST) is used to scan source code for issues such as security vulnerabilities.

Gray-box testing involves a tester with some insider knowledge about a system, but not access to its source code.

155.

Which of the following about SOAP vs. REST APIs is TRUE?

  • SOAP APIs are slower than REST APIs

  • REST APIs are reliant on XML, but SOAP APIs are not

  • SOAP APIs are more efficient than REST APIs

  • REST APIs are a better choice when stateful operations are used

Correct answer: SOAP APIs are slower than REST APIs

Simple object access protocol (SOAP) and RESTful application programming interfaces (APIs) are both common ways to programmatically interact with many cloud-based systems. 

SOAP APIs are:

  • Standards-based
  • Heavily dependent on XML formatting
  • Slower than RESTful APIs
  • Intolerant of errors

RESTful APIs are:

  • Typically more lightweight, efficient, and faster than SOAP APIs
  • Preferred for stateless operations
  • Capable of using multiple output formats including XML and JSON
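The relative weight of the two formats can be illustrated by serializing the same payload both ways using only the standard library. The element names in the XML envelope are a simplified stand-in for a real SOAP message:

```python
import json
import xml.etree.ElementTree as ET

# The same payload as a simplified SOAP-style XML envelope vs. REST-style JSON.
envelope = ET.Element("Envelope")
body = ET.SubElement(envelope, "Body")
resp = ET.SubElement(body, "GetUserResponse")
ET.SubElement(resp, "id").text = "42"
ET.SubElement(resp, "name").text = "nur"
soap_payload = ET.tostring(envelope, encoding="unicode")

rest_payload = json.dumps({"id": 42, "name": "nur"})

# The XML envelope carries the same data with considerably more markup.
print(len(soap_payload), len(rest_payload))
```

The extra markup (and the parsing it requires) is one reason SOAP APIs are typically slower and less efficient than RESTful APIs exchanging JSON.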

156.

Mia works for a network hardware and software development company. She is currently working on setting up the team that will be testing one of their new products. This particular piece of software is the Operating System (OS) of a network router. 

When conducting functional testing, which is NOT an important consideration?

  • Testing must use limited information about the application

  • Testing must be realistic for all environments

  • Testing must be designed to exercise all requirements

  • Testing must be sufficient to have reasonable assurance there are no bugs

Correct answer: Testing must use limited information about the application

Testing that uses limited information about the application is called gray-box testing, which occurs after functional testing and deployment.

Functional testing is performed on an entire system, and the following are important considerations: testing must be realistic, must exercise all requirements, and must provide reasonable assurance that there are no bugs.

157.

Which of the following organizations published the Egregious Eleven to describe the most common cloud vulnerabilities?

  • CSA

  • OWASP

  • SANS

  • NIST

Correct answer: CSA

Several organizations provide resources designed to teach about the most common vulnerabilities in different environments. Some examples include:

  • Cloud Security Alliance Top Threats to Cloud Computing: These lists name the most common threats, such as the Egregious 11. According to the CSA, the top cloud security threats include data breaches; misconfiguration and inadequate change control; lack of cloud security architecture and strategy; and insufficient identity, credential, access, and key management.
  • OWASP Top 10: The Open Web Application Security Project (OWASP) maintains multiple top 10 lists, but its web application list is the most famous and is updated every few years. The top four threats in the 2021 list were broken access control, cryptographic failures, injection, and insecure design.
  • SANS CWE Top 25: The Common Weakness Enumeration (CWE), maintained by MITRE in collaboration with SANS, describes common security errors. Its Top 25 list highlights the most dangerous and impactful weaknesses each year. In 2021, the top four were out-of-bounds write, improper neutralization of input during web page generation (cross-site scripting), out-of-bounds read, and improper input validation.

NIST doesn't publish regular lists of top vulnerabilities.

158.

Which of the following types of testing verifies that individual functions or modules work as intended?

  • Unit Testing

  • Integration Testing

  • Usability Testing

  • Regression Testing

Correct answer: Unit Testing

Functional testing is used to verify that software meets the requirements defined in the first phase of the SDLC. Examples of functional testing include:

  • Unit Testing: Unit tests verify that a single component (function, module, etc.) of the software works as intended.
  • Integration Testing: Integration testing verifies that the individual components of the software fit together correctly and that their interfaces work as designed.
  • Usability Testing: Usability testing verifies that the software meets users’ needs and provides a good user experience.
  • Regression Testing: Regression testing is performed after changes are made to the software and verifies that the changes haven’t introduced bugs or broken functionality.

Non-functional testing tests the quality of the software and verifies that it provides necessary functionality not explicitly listed in requirements. Load and stress testing or verifying that sensitive data is properly secured and encrypted are examples of non-functional testing.
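A unit test, as described above, exercises one function in isolation. A minimal sketch (the function and its test are invented for illustration):

```python
# Sketch of a unit test for a single function, the smallest testable component.
def apply_discount(price: float, percent: float) -> float:
    """Return the price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# The unit test verifies this one function's behavior, including edge cases.
def test_apply_discount():
    assert apply_discount(100.0, 25) == 75.0
    assert apply_discount(19.99, 0) == 19.99
    try:
        apply_discount(10.0, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass

test_apply_discount()
print("all unit tests passed")
```

Integration tests would then verify that `apply_discount` works correctly when called by, for example, a checkout module, rather than testing it in isolation.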

159.

Which of the following "stripes" data across multiple drives on different servers in a way that allows lost data to be recovered if a drive fails?

  • Redundant array of independent disks (RAID)

  • Storage cluster

  • Storage area network (SAN)

  • Object storage

Correct answer: Redundant array of independent disks (RAID)

RAID stores data across multiple drives using a process called striping. Combined with redundancy information such as parity, this enables the data to be recovered even if one of the disks fails.
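The parity-based recovery used by RAID 5-style arrays can be sketched with XOR. This is a deliberately simplified model, not a real RAID implementation:

```python
# Simplified sketch of RAID 5-style parity: the XOR of the data stripes
# lets any single lost stripe be rebuilt from the surviving drives.
def xor_blocks(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

stripe1 = b"DATA-ON-DRIVE-01"
stripe2 = b"DATA-ON-DRIVE-02"
parity = xor_blocks(stripe1, stripe2)  # stored on a third drive

# Drive 2 fails; rebuild its stripe from stripe1 and the parity block.
rebuilt = xor_blocks(stripe1, parity)
print(rebuilt == stripe2)  # True
```

Because XOR is its own inverse, XORing the surviving stripe with the parity block yields exactly the lost stripe.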

Storage clusters are a set of computers that are used to provide more reliable, resilient, and available storage.

Storage area networks (SANs) and network-attached storage (NAS) provide access to storage devices over the network.

Object storage is a popular form of cloud storage used by services such as Amazon S3 and Azure Blob storage. 

160.

A DLP solution is inspecting the contents of an employee's email. What stage of the DLP process is it MOST likely at?

  • Monitoring

  • Discovery

  • Mapping

  • Enforcement

Correct answer: Monitoring

Data loss prevention (DLP) solutions are designed to prevent sensitive data from being leaked or accessed by unauthorized users. In general, DLP solutions consist of three components:

  • Discovery: During the Discovery phase, the DLP solution identifies data that needs to be protected. Often, this is accomplished by looking for data stored in formats associated with sensitive data. For example, credit card numbers are usually 16 digits long, and US Social Security Numbers (SSNs) have the format XXX-XX-XXXX. The DLP will identify storage locations containing these types of data that require monitoring and protection.
  • Monitoring: After completing discovery, the DLP solution will perform ongoing monitoring of these identified locations. This includes inspecting access requests and data flows to identify potential violations. For example, a DLP solution may be integrated into email software to look for data leaks or monitor for sensitive data stored outside of approved locations.
  • Enforcement: If a DLP solution identifies a violation, it can take action. This may include generating an alert for security personnel to investigate and/or block the unapproved action.

Mapping is not a stage of the DLP process.
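The pattern matching described in the Discovery and Monitoring stages can be sketched with regular expressions. This is a simplified illustration; real DLP products add validation (such as Luhn checks on card numbers) and contextual analysis to reduce false positives:

```python
import re

# Sketch of DLP-style pattern matching for the formats mentioned above.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")          # XXX-XX-XXXX
CARD_RE = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")    # 16-digit card numbers

def find_sensitive(text: str) -> dict:
    """Return candidate SSNs and card numbers found in the text."""
    return {
        "ssn": SSN_RE.findall(text),
        "card": CARD_RE.findall(text),
    }

email_body = "My SSN is 123-45-6789 and my card is 4111 1111 1111 1111."
print(find_sensitive(email_body))
```

A DLP solution integrated with email, as in the question, would run checks like these against outbound messages during the Monitoring stage and hand any hits to the Enforcement stage.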