Computer-Parallel Computing MCQ

11. In shared memory
  • Multiple processors can operate independently but share the same memory resources
  • Multiple processors can operate independently but do not share the same memory resources
  • Multiple processors can operate independently but some do not share the same memory resources
  • None of these
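For illustration, a minimal shared-memory sketch using OpenMP in C (the thread count and array contents are arbitrary choices, not part of the question): several threads run independently while reading and writing the same array.

    #include <stdio.h>
    #include <omp.h>

    int main(void) {
        int data[4] = {0};                 /* one array, shared by all threads */

        #pragma omp parallel num_threads(4)
        {
            int tid = omp_get_thread_num();
            data[tid] = tid * tid;         /* each thread operates independently on the shared array */
        }

        for (int i = 0; i < 4; i++)
            printf("data[%d] = %d\n", i, data[i]);
        return 0;
    }
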
12. In designing a parallel program, one has to break the problem into discrete chunks of work that can be distributed to multiple tasks. This is known as
  • Decomposition
  • Partitioning
  • Compounding
  • Both A and B
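As a rough sketch of this partitioning step (plain C; the work size of 100 and the 4 tasks are hypothetical numbers), breaking a problem into discrete chunks amounts to mapping a global index range onto per-task sub-ranges:

    #include <stdio.h>

    int main(void) {
        const int n = 100;      /* total amount of work (hypothetical) */
        const int ntasks = 4;   /* number of tasks it is divided among (hypothetical) */
        const int chunk = (n + ntasks - 1) / ntasks;   /* ceiling division */

        for (int t = 0; t < ntasks; t++) {
            int lo = t * chunk;
            int hi = (lo + chunk < n) ? lo + chunk : n;
            printf("task %d handles indices [%d, %d)\n", t, lo, hi);
        }
        return 0;
    }
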
13. Latency is
  • Partitioning in which the data associated with a problem is decomposed. Each parallel task then works on a portion of the data.
  • Partitioning in which the focus is on the computation that is to be performed rather than on the data manipulated by the computation. The problem is decomposed according to the work that must be done. Each task then performs a portion of the overall work.
  • It is the time it takes to send a minimal (0 byte) message from one point to another
  • None of these
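A hedged sketch of how this latency is commonly measured: a repeated 0-byte ping-pong between two MPI ranks, timed with MPI_Wtime (the repetition count and message tag are arbitrary):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        const int reps = 1000;
        double t0 = MPI_Wtime();
        for (int i = 0; i < reps; i++) {
            if (rank == 0) {
                MPI_Send(NULL, 0, MPI_BYTE, 1, 0, MPI_COMM_WORLD);   /* 0-byte message */
                MPI_Recv(NULL, 0, MPI_BYTE, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            } else if (rank == 1) {
                MPI_Recv(NULL, 0, MPI_BYTE, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
                MPI_Send(NULL, 0, MPI_BYTE, 0, 0, MPI_COMM_WORLD);
            }
        }
        double t1 = MPI_Wtime();

        if (rank == 0)   /* half the round-trip time approximates one-way latency */
            printf("approx. latency: %g us\n", (t1 - t0) / (2.0 * reps) * 1e6);

        MPI_Finalize();
        return 0;
    }
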
14. Domain Decomposition
  • Partitioning in which the data associated with a problem is decomposed. Each parallel task then works on a portion of the data.
  • Partitioning in which the focus is on the computation that is to be performed rather than on the data manipulated by the computation. The problem is decomposed according to the work that must be done. Each task then performs a portion of the overall work.
  • It is the time it takes to send a minimal (0 byte) message from point A to point B
  • None of these
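A minimal MPI sketch of domain decomposition (the problem size N and the per-element work are assumptions): the data is sliced so that each rank works only on its own portion.

    #include <mpi.h>
    #include <stdio.h>

    #define N 1000                 /* global problem size (hypothetical) */

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* each task owns one contiguous slice of the data domain */
        int chunk = N / size;
        int lo = rank * chunk;
        int hi = (rank == size - 1) ? N : lo + chunk;

        double local_sum = 0.0;
        for (int i = lo; i < hi; i++)   /* work only on this rank's portion */
            local_sum += (double)i;

        printf("rank %d processed indices [%d, %d), partial sum %.0f\n",
               rank, lo, hi, local_sum);
        MPI_Finalize();
        return 0;
    }
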
15. Functional Decomposition:
  • Partitioning in which the data associated with a problem is decomposed. Each parallel task then works on a portion of the data.
  • Partitioning in which the focus is on the computation that is to be performed rather than on the data manipulated by the computation. The problem is decomposed according to the work that must be done. Each task then performs a portion of the overall work.
  • It is the time it takes to send a minimal (0 byte) message from point A to point B
  • None of these
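For contrast, a minimal MPI sketch of functional decomposition (the stage names are purely illustrative): each rank is assigned a different kind of work rather than a different piece of the same data.

    #include <mpi.h>
    #include <stdio.h>

    /* illustrative stages of a hypothetical pipeline */
    static void read_input(void)    { printf("stage: reading input\n"); }
    static void run_model(void)     { printf("stage: running model\n"); }
    static void write_results(void) { printf("stage: writing results\n"); }

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* the problem is split by the work to be done, not by the data */
        switch (rank) {
            case 0: read_input();    break;
            case 1: run_model();     break;
            case 2: write_results(); break;
            default: break;          /* extra ranks sit idle in this sketch */
        }

        MPI_Finalize();
        return 0;
    }
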
17. Synchronous Communications
  • It requires some type of "handshaking" between tasks that are sharing data. This can be explicitly structured in code by the programmer, or it may happen at a lower level unknown to the programmer.
  • It involves data sharing between more than two tasks, which are often specified as being members in a common group, or collective
  • It involves two tasks with one task acting as the sender/producer of data, and the other acting as the receiver/consumer.
  • It allows tasks to transfer data independently from one another
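A small MPI sketch of synchronous, handshake-style communication (the message value and tag are arbitrary): MPI_Ssend does not complete until the matching receive has started, so the two tasks synchronize at the transfer.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, value = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            value = 42;
            /* synchronous send: returns only after the receiver has started
               the matching receive -- an explicit handshake between tasks */
            MPI_Ssend(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("rank 1 received %d after the handshake\n", value);
        }

        MPI_Finalize();
        return 0;
    }
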
18. Collective communication
  • It involves data sharing between more than two tasks, which are often specified as being members in a common group, or collective
  • It involves two tasks with one task acting as the sender/producer of data, and the other acting as the receiver/consumer.
  • It allows tasks to transfer data independently from one another
  • None of these
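A minimal sketch of collective communication in MPI (the contributed values are illustrative): every rank in the communicator's group takes part in a single MPI_Reduce call.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        int local = rank + 1;    /* each member of the group contributes a value */
        int total = 0;

        /* all ranks participate in one collective operation */
        MPI_Reduce(&local, &total, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("sum over %d ranks = %d\n", size, total);

        MPI_Finalize();
        return 0;
    }
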
19. Point-to-point communication refers to
  • It involves data sharing between more than two tasks, which are often specified as being members in a common group, or collective
  • It involves two tasks with one task acting as the sender/producer of data, and the other acting as the receiver/consumer.
  • It allows tasks to transfer data independently from one another.
  • None of these
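A minimal point-to-point sketch in MPI (the message value and tag are arbitrary): one task acts as the sender/producer, the other as the receiver/consumer.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, msg;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {                 /* sender / producer */
            msg = 7;
            MPI_Send(&msg, 1, MPI_INT, 1, 99, MPI_COMM_WORLD);
        } else if (rank == 1) {          /* receiver / consumer */
            MPI_Recv(&msg, 1, MPI_INT, 0, 99, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("rank 1 consumed message %d from rank 0\n", msg);
        }

        MPI_Finalize();
        return 0;
    }
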
20. Uniform Memory Access (UMA) refers to
  • Here all processors have equal access and access times to memory
  • Here if one processor updates a location in shared memory, all the other processors know about the update
  • Here one SMP can directly access memory of another SMP and not all processors have equal access time to all memories
  • None of these