UNIT-I
Unit-1 MCQs
Data Science and Machine Learning
Neural Networks: History MCQs
Early Foundations
1. Who is considered the father of artificial neural networks?
a) Alan Turing
b) Warren McCulloch
c) John von Neumann
d) Geoffrey Hinton
Answer: b) Warren McCulloch
2. In which year was the first artificial neuron model proposed?
a) 1936
b) 1943
c) 1956
d) 1986
Answer: b) 1943
3. The McCulloch-Pitts neuron model was based on which mathematical function?
a) Sigmoid function
b) Step function
c) ReLU function
d) Softmax function
Answer: b) Step function
4. The McCulloch-Pitts model was designed to:
a) Simulate biological neurons
b) Solve regression problems
c) Classify images
d) Optimize machine learning algorithms
Answer: a) Simulate biological neurons
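Questions 3 and 4 describe the mechanics of the McCulloch-Pitts neuron: binary inputs, a weighted sum, and a step (threshold) activation. A minimal Python sketch of that idea; the weights and threshold below are illustrative choices that wire the neuron as an AND gate, not values taken from the original 1943 paper:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Step-function neuron: fire (1) if the weighted sum reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Illustrative wiring as an AND gate: the neuron fires only when both inputs are 1.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", mcculloch_pitts([x1, x2], weights=[1, 1], threshold=2))
```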
5. The first perceptron model was developed by:
a) Warren McCulloch and Walter Pitts
b) Frank Rosenblatt
c) Marvin Minsky
d) Geoffrey Hinton
Answer: b) Frank Rosenblatt
6. In which year did Frank Rosenblatt develop the Perceptron model?
a) 1943
b) 1958
c) 1986
d) 2006
Answer: b) 1958
The Perceptron and Early Challenges
7. The Perceptron Algorithm was designed for:
a) Classification tasks
b) Regression tasks
c) Reinforcement learning
d) Unsupervised learning
Answer: a) Classification tasks
8. The Perceptron could not solve which type of problem?
a) Linearly separable problems
b) XOR problem
c) AND problem
d) OR problem
Answer: b) XOR problem
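Question 8's limitation can be checked directly: the perceptron learning rule converges on a linearly separable function such as AND but never converges on XOR. A sketch, with an arbitrary learning rate and epoch budget:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Perceptron rule: w += lr * (target - prediction) * x."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        errors = 0
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            update = lr * (target - pred)
            w, b = w + update * xi, b + update
            errors += int(update != 0)
        if errors == 0:               # a full pass with no mistakes: converged
            return True
    return False                      # never separated the classes

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print("AND converges:", train_perceptron(X, np.array([0, 0, 0, 1])))   # True
print("XOR converges:", train_perceptron(X, np.array([0, 1, 1, 0])))   # False
```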
9. The book "Perceptrons" (1969) was written by:
a) Geoffrey Hinton and Yann LeCun
b) Warren McCulloch and Frank Rosenblatt
c) Marvin Minsky and Seymour Papert
d) Andrew Ng and Ian Goodfellow
Answer: c) Marvin Minsky and Seymour Papert
10. The perceptron's limitations led to:
a) A decline in neural network research
b) Faster adoption of AI
c) Introduction of convolutional networks
d) More investment in deep learning
Answer: a) A decline in neural network research
11. The decline in neural network research in the 1970s and 1980s is known as:
a) AI Revolution
b) AI Winter
c) Deep Learning Boom
d) Perceptron Growth
Answer: b) AI Winter
Backpropagation and Multi-Layer Perceptrons
12. Which algorithm allowed neural networks to learn efficiently?
a) K-Means Clustering
b) Support Vector Machines
c) Backpropagation
d) Genetic Algorithms
Answer: c) Backpropagation
13. Backpropagation was popularized for training neural networks by the 1986 paper of:
a) Warren McCulloch
b) Yann LeCun
c) Rumelhart, Hinton, and Williams
d) Andrew Ng
Answer: c) Rumelhart, Hinton, and Williams
14. Backpropagation helped train which type of neural network?
a) Single-layer perceptron
b) Multi-layer perceptron (MLP)
c) Convolutional neural network (CNN)
d) Recurrent neural network (RNN)
Answer: b) Multi-layer perceptron (MLP)
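Question 14 is the historical payoff: adding a hidden layer and training it with backpropagation solves the XOR problem from question 8. A compact numpy sketch; the 2-4-1 architecture, learning rate, and iteration count are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# XOR: not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output

lr = 1.0
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule through both sigmoid layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2).ravel())   # should approach [0, 1, 1, 0]
```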
Advancements in Neural Networks
15. In the 1990s, Yann LeCun developed which neural network for image recognition?
a) Recurrent Neural Network (RNN)
b) Convolutional Neural Network (CNN)
c) Deep Belief Network (DBN)
d) Hopfield Network
Answer: b) Convolutional Neural Network (CNN)
16. Neural networks became widely used again in:
a) 1970s
b) 1980s
c) 1990s
d) 2000s
Answer: d) 2000s
17. Which neural network architecture is best suited for time-series prediction?
a) Convolutional Neural Networks (CNN)
b) Multi-layer Perceptrons (MLP)
c) Recurrent Neural Networks (RNN)
d) Random Forest
Answer: c) Recurrent Neural Networks (RNN)
Deep Learning Era
18. The Deep Learning boom was driven by:
a) Increase in computational power
b) Large datasets
c) Advances in backpropagation
d) All of the above
Answer: d) All of the above
19. The ImageNet competition helped advance:
a) Reinforcement Learning
b) Support Vector Machines
c) Deep Learning
d) Bayesian Networks
Answer: c) Deep Learning
20. Which model achieved a breakthrough in 2012 in ImageNet?
a) LeNet-5
b) AlexNet
c) Deep Belief Networks
d) GPT-2
Answer: b) AlexNet
Modern Developments
21. The Transformer architecture was introduced in which year?
a) 2010
b) 2015
c) 2017
d) 2020
Answer: c) 2017
22. The Transformer model was developed by researchers at:
a) OpenAI
b) DeepMind
c) Google Brain
d) Facebook AI
Answer: c) Google Brain
23. Neural networks used for text generation are based on:
a) Convolutional Networks
b) Decision Trees
c) Transformer Models
d) Naïve Bayes
Answer: c) Transformer Models
Challenges and Future Trends
24. The main challenge in training deep neural networks is:
a) High computational cost
b) Large data requirements
c) Overfitting
d) All of the above
Answer: d) All of the above
25. Which of the following is a solution to the vanishing gradient problem?
a) Using ReLU activation
b) Increasing learning rate
c) Reducing layers
d) Using Naïve Bayes
Answer: a) Using ReLU activation
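The fix in question 25 is easy to verify numerically: the sigmoid's derivative, σ(z)(1 − σ(z)), peaks at 0.25 and collapses toward zero for large |z|, while ReLU's derivative stays at 1 for any positive input. A quick check on a few arbitrary sample points:

```python
import numpy as np

z = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])

sig = 1 / (1 + np.exp(-z))
sigmoid_grad = sig * (1 - sig)       # at most 0.25; ~5e-5 at |z| = 10
relu_grad = (z > 0).astype(float)    # exactly 1 wherever the unit is active

print("sigmoid'(z):", sigmoid_grad.round(5))
print("relu'(z):   ", relu_grad)
```

Multiplying many such sigmoid derivatives across layers is what shrinks gradients in deep networks; ReLU avoids that shrinkage on its active path.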
26. The "attention mechanism" is critical for:
a) Convolutional Networks
b) RNNs
c) Transformers
d) Naïve Bayes
Answer: c) Transformers
27. Neural networks today are widely used in:
a) Image recognition
b) Natural language processing
c) Autonomous driving
d) All of the above
Answer: d) All of the above
Artificial and Biological Neural Networks MCQs
Basic Concepts of Neural Networks
28. The main difference between artificial and biological neural networks is:
a) Artificial networks can learn faster
b) Biological networks use electrical and chemical signals
c) Artificial networks do not require data
d) Biological networks are slower but more efficient
Answer: b) Biological networks use electrical and chemical signals
29. Biological neurons communicate using:
a) Binary codes
b) Synaptic transmission
c) Machine learning algorithms
d) Logical gates
Answer: b) Synaptic transmission
30. In biological neurons, the junction between two neurons where communication occurs is called:
a) Dendrite
b) Synapse
c) Axon
d) Soma
Answer: b) Synapse
31. Which of the following is not a part of a biological neuron?
a) Dendrites
b) Axon
c) Perceptron
d) Synapse
Answer: c) Perceptron
32. The primary function of dendrites in biological neurons is:
a) Transmitting signals to the next neuron
b) Receiving signals from other neurons
c) Storing information
d) None of the above
Answer: b) Receiving signals from other neurons
Comparison of Biological and Artificial Neural Networks
33. Artificial neurons are inspired by:
a) Logical circuits
b) Human brain neurons
c) Genetic algorithms
d) None of the above
Answer: b) Human brain neurons
34. In biological neurons, the axon is responsible for:
a) Receiving inputs
b) Processing signals
c) Transmitting signals to other neurons
d) Storing memories
Answer: c) Transmitting signals to other neurons
35. The activation function in an artificial neural network is similar to:
a) Neurotransmitter release in biological neurons
b) The nucleus of a biological neuron
c) The action potential in a neuron
d) DNA replication
Answer: c) The action potential in a neuron
36. Biological neurons process information using:
a) Mathematical functions
b) Logic gates
c) Electrical and chemical signals
d) CPU instructions
Answer: c) Electrical and chemical signals
37. The number of neurons in a typical human brain is approximately:
a) 10 million
b) 100 million
c) 86 billion
d) 1 trillion
Answer: c) 86 billion
38. Artificial Neural Networks (ANNs) are mainly used for:
a) Storing memory
b) Performing calculations
c) Pattern recognition and learning
d) None of the above
Answer: c) Pattern recognition and learning
Neural Network Architecture
39. In artificial neural networks, weights are similar to:
a) Synaptic strengths in biological neurons
b) Neuron count in the brain
c) DNA sequences
d) Action potentials
Answer: a) Synaptic strengths in biological neurons
40. The input layer in an artificial neural network corresponds to which part of a biological neuron?
a) Axon
b) Dendrites
c) Synapse
d) Myelin sheath
Answer: b) Dendrites
41. The output layer of an artificial neural network is similar to the:
a) Dendrites of a neuron
b) Axon of a neuron
c) Synapse of a neuron
d) Soma of a neuron
Answer: b) Axon of a neuron
42. In ANNs, a hidden layer is responsible for:
a) Directly taking inputs
b) Mapping inputs to outputs
c) Processing features and extracting patterns
d) Eliminating neurons
Answer: c) Processing features and extracting patterns
43. The function of synapses in biological neurons is most similar to:
a) Weights in artificial neural networks
b) Bias terms in neural networks
c) Loss functions
d) Activation functions
Answer: a) Weights in artificial neural networks
Learning and Adaptation
44. How do biological neurons "learn"?
a) By modifying synaptic connections
b) By increasing the number of neurons
c) By changing the brain structure
d) By generating new action potentials
Answer: a) By modifying synaptic connections
45. In artificial neural networks, learning occurs by adjusting:
a) Activation functions
b) Bias terms
c) Weights and biases
d) The number of layers
Answer: c) Weights and biases
46. Which of the following best describes Hebbian Learning in biological neurons?
a) "Neurons that fire together, wire together"
b) "A neuron cannot be activated twice"
c) "Neurons compete for activation"
d) "Neural pathways remain unchanged"
Answer: a) "Neurons that fire together, wire together"
47. The learning mechanism in artificial neural networks is inspired by:
a) Hebbian Learning
b) Logical reasoning
c) Symbolic AI
d) Rule-based programming
Answer: a) Hebbian Learning
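In its simplest mathematical form, the Hebbian rule from questions 46-47 increases a weight in proportion to the joint activity of the connected units, Δw = η·x·y. A toy sketch; the learning rate and firing patterns are invented for illustration:

```python
eta = 0.1   # learning rate (illustrative)
w = 0.0     # initial connection strength

pre_activity  = [1, 1, 0, 1]   # presynaptic firing pattern (toy data)
post_activity = [1, 1, 1, 0]   # postsynaptic firing pattern (toy data)

for x, y in zip(pre_activity, post_activity):
    w += eta * x * y           # strengthens only when both units fire together

print(w)   # 0.2: two co-firing events, each adding eta
```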
Applications and Future Trends
48. Which of the following is a major limitation of biological neural networks compared to artificial neural networks?
a) Biological neurons are slower in computation
b) Biological neurons cannot learn
c) Biological neurons require large datasets
d) Biological neurons are not energy efficient
Answer: a) Biological neurons are slower in computation
49. Artificial neural networks outperform biological neural networks in:
a) Energy efficiency
b) Parallel processing
c) Speed of computation
d) Cognitive flexibility
Answer: c) Speed of computation
50. Which AI-based technology is closest to mimicking human cognition?
a) Convolutional Neural Networks (CNNs)
b) Spiking Neural Networks (SNNs)
c) Support Vector Machines (SVMs)
d) Decision Trees
Answer: b) Spiking Neural Networks (SNNs)
51. The field that aims to create computers inspired by the human brain is called:
a) Artificial Intelligence
b) Neuroscience
c) Neuromorphic Computing
d) Data Science
Answer: c) Neuromorphic Computing
52. Which of the following is an example of an application of Artificial Neural Networks?
a) Image recognition
b) Speech recognition
c) Medical diagnosis
d) All of the above
Answer: d) All of the above
53. Future advancements in neural networks may involve:
a) Energy-efficient AI models
b) Hybrid biological-computational networks
c) Quantum neural networks
d) All of the above
Answer: d) All of the above
Basic Concepts and Structure of Biological Neurons
54. What is the primary function of biological neurons in the human brain?
a) Storing data
b) Transmitting and processing information
c) Managing blood circulation
d) Regulating hormone production
Answer: b) Transmitting and processing information
55. The part of a biological neuron that receives incoming signals is called:
a) Axon
b) Soma
c) Dendrite
d) Synapse
Answer: c) Dendrite
56. The axon of a neuron is responsible for:
a) Receiving signals
b) Transmitting signals to other neurons
c) Processing information
d) Storing genetic material
Answer: b) Transmitting signals to other neurons
57. The junction between two neurons where information is transmitted is called:
a) Soma
b) Synapse
c) Myelin sheath
d) Dendrite
Answer: b) Synapse
58. The myelin sheath in biological neurons functions to:
a) Increase the speed of electrical signal transmission
b) Store neurotransmitters
c) Decrease synaptic connections
d) Generate new neurons
Answer: a) Increase the speed of electrical signal transmission
59. In biological neurons, information is transmitted through:
a) Chemical signals only
b) Electrical impulses and chemical signals
c) Heat energy
d) Mechanical signals
Answer: b) Electrical impulses and chemical signals
60. The space between two neurons where neurotransmitters are released is called:
a) Axon terminal
b) Neural gap
c) Synaptic cleft
d) Dendritic spine
Answer: c) Synaptic cleft
Neural Activity and Learning in Biological Neurons
61. Neurotransmitters in biological neurons function to:
a) Store information
b) Strengthen axon connections
c) Facilitate signal transmission between neurons
d) Repair damaged neurons
Answer: c) Facilitate signal transmission between neurons
62. The ability of neural connections to strengthen or weaken over time in response to activity is called:
a) Neuroplasticity
b) Neurotransmission
c) Synaptic blocking
d) Axonal degradation
Answer: a) Neuroplasticity
63. The process of long-term potentiation (LTP) in biological neurons is associated with:
a) Forgetting old information
b) Strengthening of synaptic connections
c) Reducing neuron activity
d) Signal blockage
Answer: b) Strengthening of synaptic connections
64. The biological learning process in neurons is similar to which machine learning concept?
a) Feature extraction
b) Weight adjustment in neural networks
c) Overfitting prevention
d) Data augmentation
Answer: b) Weight adjustment in neural networks
65. Which type of neuron is responsible for transmitting signals from the brain to muscles?
a) Sensory neuron
b) Motor neuron
c) Interneuron
d) Glial cell
Answer: b) Motor neuron
Comparison of Biological and Artificial Neurons
66. What is the main difference between biological and artificial neurons?
a) Artificial neurons process signals faster
b) Biological neurons use electrical circuits
c) Artificial neurons can regenerate
d) Biological neurons cannot form networks
Answer: a) Artificial neurons process signals faster
67. In artificial neural networks, weights correspond to what in biological neurons?
a) Axons
b) Synaptic strengths
c) Soma
d) Myelin sheath
Answer: b) Synaptic strengths
68. The activation function in artificial neural networks is similar to:
a) The chemical reaction in synapses
b) The physical structure of neurons
c) The decision-making process of biological neurons
d) The number of neurons in the brain
Answer: c) The decision-making process of biological neurons
69. The concept of backpropagation in artificial neural networks is inspired by:
a) The way neurons transmit electrical signals
b) The way synapses adjust based on experience
c) The rapid regeneration of neurons
d) The formation of new brain cells
Answer: b) The way synapses adjust based on experience
70. The synaptic pruning process in biological neurons is similar to which concept in machine learning?
a) Data normalization
b) Feature selection and regularization
c) Cross-validation
d) Data augmentation
Answer: b) Feature selection and regularization
Neural Networks and Machine Learning Applications
71. Which machine learning model is directly inspired by biological neurons?
a) Decision Trees
b) Support Vector Machines (SVMs)
c) Artificial Neural Networks (ANNs)
d) K-Means Clustering
Answer: c) Artificial Neural Networks (ANNs)
72. Which of the following deep learning models mimics the visual processing system of the brain?
a) Recurrent Neural Networks (RNNs)
b) Convolutional Neural Networks (CNNs)
c) Support Vector Machines (SVMs)
d) Bayesian Networks
Answer: b) Convolutional Neural Networks (CNNs)
73. Spiking Neural Networks (SNNs) attempt to mimic:
a) The binary operation of traditional computers
b) The energy-efficient communication of biological neurons
c) The structure of decision trees
d) Statistical probability distributions
Answer: b) The energy-efficient communication of biological neurons
74. Unlike artificial neural networks, biological neurons:
a) Have fixed learning rates
b) Can continuously adapt without explicit training datasets
c) Process data sequentially
d) Only communicate in binary signals
Answer: b) Can continuously adapt without explicit training datasets
75. Which aspect of biological neurons is missing in artificial neural networks?
a) Parallel processing
b) Chemical signal processing
c) Learning ability
d) Weighted connections
Answer: b) Chemical signal processing
Advanced Applications and Future Trends
76. Which technology aims to replicate the structure and function of biological neurons in hardware?
a) Digital computing
b) Neuromorphic computing
c) Quantum computing
d) Traditional AI models
Answer: b) Neuromorphic computing
77. Which neural network architecture is most similar to biological neurons?
a) Feedforward Neural Networks
b) Spiking Neural Networks (SNNs)
c) Logistic Regression
d) Reinforcement Learning Models
Answer: b) Spiking Neural Networks (SNNs)
78. The energy efficiency of biological neurons compared to artificial neural networks is due to:
a) Faster computation
b) Parallel processing with minimal energy consumption
c) Large memory storage
d) Use of electrical signals only
Answer: b) Parallel processing with minimal energy consumption
79. What future advancement could make artificial neural networks more like biological neurons?
a) Using larger datasets
b) Developing hardware with neuromorphic computing principles
c) Increasing network depth
d) Implementing more activation functions
Answer: b) Developing hardware with neuromorphic computing principles
80. The ultimate goal of neuromorphic AI research is to:
a) Replace human brains
b) Mimic the brain’s learning, adaptability, and energy efficiency
c) Improve traditional computer storage
d) Build faster internet connections
Answer: b) Mimic the brain’s learning, adaptability, and energy efficiency
Basic Concepts of Single-Neuron Models
81. The simplest model of an artificial neuron is called:
a) Perceptron
b) CNN
c) RNN
d) Autoencoder
Answer: a) Perceptron
82. In artificial neurons, the weighted sum of inputs is passed through a:
a) Learning function
b) Activation function
c) Backpropagation algorithm
d) Feature extraction method
Answer: b) Activation function
83. What is the main function of a single-neuron model in machine learning?
a) Store large amounts of data
b) Perform simple decision-making based on weighted inputs
c) Identify patterns in unlabeled data
d) Predict future trends in time series
Answer: b) Perform simple decision-making based on weighted inputs
84. The McCulloch-Pitts neuron model operates using which type of activation function?
a) Sigmoid
b) Step function
c) ReLU
d) Softmax
Answer: b) Step function
85. The threshold function in a McCulloch-Pitts neuron decides:
a) The weight values of each input
b) Whether the neuron "fires" (outputs 1) or remains inactive (outputs 0)
c) The backpropagation learning rate
d) The total number of neurons in the network
Answer: b) Whether the neuron "fires" (outputs 1) or remains inactive (outputs 0)
86. What limitation does a single-layer perceptron have?
a) It cannot model linear functions
b) It cannot solve non-linearly separable problems
c) It requires multiple hidden layers
d) It is too computationally expensive
Answer: b) It cannot solve non-linearly separable problems
Neuron Models and Activation Functions
87. Which activation function is commonly used in deep learning because it allows gradients to flow efficiently?
a) Step function
b) ReLU (Rectified Linear Unit)
c) Hard limit function
d) Tanh
Answer: b) ReLU (Rectified Linear Unit)
88. The sigmoid activation function is commonly used in:
a) Regression tasks
b) Binary classification problems
c) Unsupervised learning
d) Clustering algorithms
Answer: b) Binary classification problems
89. What is a drawback of using the sigmoid function in neural networks?
a) It is non-differentiable
b) It suffers from vanishing gradients for very large or small input values
c) It does not allow non-linearity in models
d) It is computationally expensive
Answer: b) It suffers from vanishing gradients for very large or small input values
90. The tanh activation function differs from the sigmoid function because:
a) It only outputs values between 0 and 1
b) It is not differentiable
c) It has an output range of -1 to 1, making it zero-centered
d) It cannot be used for classification
Answer: c) It has an output range of -1 to 1, making it zero-centered
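Questions 87-90 contrast the standard activation functions; here are their definitions side by side, including the leaky ReLU that question 96 mentions later (the 0.01 slope is the conventional default, and the sample inputs are arbitrary):

```python
import numpy as np

def step(z):       return np.where(z >= 0, 1.0, 0.0)     # McCulloch-Pitts style
def sigmoid(z):    return 1 / (1 + np.exp(-z))           # range (0, 1), saturates
def tanh(z):       return np.tanh(z)                     # range (-1, 1), zero-centered
def relu(z):       return np.maximum(0.0, z)             # gradient 1 for z > 0
def leaky_relu(z): return np.where(z > 0, z, 0.01 * z)   # avoids the "dying ReLU" problem

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
for f in (step, sigmoid, tanh, relu, leaky_relu):
    print(f"{f.__name__:>10}: {f(z).round(3)}")
```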
Mathematical Models of Neurons
91. The mathematical representation of a single artificial neuron is given by:
a) y = f(Σ wᵢxᵢ + b)
b) y = Σ wᵢxᵢ + b (no activation function)
c) y = f(Σ xᵢ) (no weights)
d) y = f(wᵢ + b)
Answer: a) y = f(Σ wᵢxᵢ + b)
92. In the perceptron model, weights are adjusted using which learning rule?
a) Gradient descent
b) Hebbian learning
c) Perceptron learning rule
d) Genetic algorithm
Answer: c) Perceptron learning rule
93. The linear activation function is mainly used in:
a) Classification tasks
b) Regression problems
c) Clustering
d) Reinforcement learning
Answer: b) Regression problems
94. Which activation function outputs both positive and negative values while remaining computationally simple?
a) ReLU
b) Sigmoid
c) Tanh
d) Step function
Answer: c) Tanh
Advanced Neuron Models and Learning
95. The Hebbian learning rule states that:
a) Neurons that fire together, wire together
b) Errors decrease with each training iteration
c) Backpropagation is the best learning method
d) A single neuron is sufficient for deep learning
Answer: a) Neurons that fire together, wire together
96. The Leaky ReLU function was introduced to overcome which issue in standard ReLU?
a) Vanishing gradients
b) Exploding gradients
c) Dying ReLU problem
d) Overfitting
Answer: c) Dying ReLU problem
97. Which type of neuron model introduces stochastic behavior into activation?
a) Perceptron
b) Boltzmann machine
c) Deep feedforward network
d) Softmax regression
Answer: b) Boltzmann machine
98. Which model allows neurons to have time-dependent activations, making them useful for sequence learning?
a) Recurrent Neural Networks (RNNs)
b) Convolutional Neural Networks (CNNs)
c) Perceptrons
d) Autoencoders
Answer: a) Recurrent Neural Networks (RNNs)
Practical Applications of Single-Neuron Models
99. A single-layer perceptron can be used to classify:
a) XOR function
b) Linearly separable problems
c) Image data
d) Time-series predictions
Answer: b) Linearly separable problems
100. The biological equivalent of the activation function in an artificial neuron is:
a) The dendrite
b) The synapse
c) The action potential threshold
d) The axon
Answer: c) The action potential threshold
101. In machine learning, what is the role of bias (b) in a single-neuron model?
a) It determines the number of inputs
b) It introduces flexibility by shifting the activation function
c) It replaces the weight parameter
d) It removes the need for backpropagation
Answer: b) It introduces flexibility by shifting the activation function
102. Which of the following is NOT a characteristic of a single-layer perceptron?
a) It uses a linear decision boundary
b) It can solve XOR problems
c) It can classify AND/OR logic gates
d) It learns weights using supervised learning
Answer: b) It can solve XOR problems
103. The Widrow-Hoff learning rule (or Delta rule) is a modified version of which method?
a) Gradient Descent
b) Hebbian Learning
c) Reinforcement Learning
d) Decision Tree Algorithm
Answer: a) Gradient Descent
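The Widrow-Hoff (delta/LMS) rule from question 103 is gradient descent on squared error with a linear output: Δw = η·(t − y)·x, where y = w·x + b. A minimal sketch fitting an invented linear target; the data, learning rate, and epoch count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
t = X @ np.array([2.0, -1.0]) + 0.5        # toy linear target (invented)

w, b, eta = np.zeros(2), 0.0, 0.05
for _ in range(50):                        # a few sweeps over the data
    for xi, ti in zip(X, t):
        y = xi @ w + b                     # linear output, no step function
        w += eta * (ti - y) * xi           # Widrow-Hoff / delta / LMS update
        b += eta * (ti - y)

print(w.round(2), round(b, 2))             # approaches [2.0, -1.0] and 0.5
```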
104. If a perceptron fails to converge, it means:
a) The dataset is too small
b) The problem is non-linearly separable
c) The weights are too large
d) The learning rate is too high
Answer: b) The problem is non-linearly separable
Basic Concepts of Neural Network Models
105. What is a neural network model primarily used for in machine learning?
a) Storing large amounts of data
b) Learning patterns from data and making predictions
c) Performing arithmetic calculations
d) Running operating systems
Answer: b) Learning patterns from data and making predictions
106. Which of the following is a basic building block of artificial neural networks?
a) Neurons
b) Data clusters
c) Feature maps
d) Activation layers
Answer: a) Neurons
107. A feedforward neural network (FNN) is different from other models because:
a) It allows feedback loops
b) It processes information in one direction only
c) It does not use activation functions
d) It is always a deep network
Answer: b) It processes information in one direction only
108. Which neural network model is best suited for image recognition tasks?
a) Recurrent Neural Networks (RNNs)
b) Convolutional Neural Networks (CNNs)
c) Feedforward Neural Networks (FNNs)
d) Boltzmann Machines
Answer: b) Convolutional Neural Networks (CNNs)
Feedforward and Convolutional Neural Networks (CNNs)
109. What is the primary advantage of CNNs over traditional feedforward networks?
a) CNNs have fewer layers
b) CNNs are better at processing sequential data
c) CNNs automatically detect spatial patterns in data
d) CNNs do not require training
Answer: c) CNNs automatically detect spatial patterns in data
110. What is the main function of pooling layers in CNNs?
a) To increase computational cost
b) To downsample feature maps and reduce dimensionality
c) To perform backpropagation
d) To add more neurons to the network
Answer: b) To downsample feature maps and reduce dimensionality
111. Which type of pooling is most commonly used in CNNs?
a) Max pooling
b) Average pooling
c) Min pooling
d) Global pooling
Answer: a) Max pooling
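Questions 110-111 in code: a non-overlapping 2×2 max-pooling pass halves each spatial dimension of a feature map while keeping the strongest activation in every window. The 4×4 input values are arbitrary:

```python
import numpy as np

feature_map = np.array([[1, 3, 2, 0],
                        [4, 6, 1, 2],
                        [7, 2, 9, 5],
                        [3, 1, 4, 8]], dtype=float)

def max_pool_2x2(fm):
    """Non-overlapping 2x2 max pooling: H x W -> H/2 x W/2."""
    h, w = fm.shape
    return fm.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

print(max_pool_2x2(feature_map))
# [[6. 2.]
#  [7. 9.]]
```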
112. The main difference between a shallow and a deep neural network is:
a) The number of input neurons
b) The number of hidden layers
c) The size of the dataset
d) The activation function used
Answer: b) The number of hidden layers
Recurrent Neural Networks (RNNs) and LSTMs
113. Which neural network is best suited for processing sequential data like time series or speech?
a) Convolutional Neural Networks (CNNs)
b) Feedforward Neural Networks (FNNs)
c) Recurrent Neural Networks (RNNs)
d) Self-Organizing Maps (SOMs)
Answer: c) Recurrent Neural Networks (RNNs)
114. What problem do standard RNNs suffer from?
a) Vanishing and exploding gradients
b) Inability to process sequential data
c) Lack of non-linearity
d) Overfitting on large datasets
Answer: a) Vanishing and exploding gradients
115. Which specialized RNN architecture helps overcome the vanishing gradient problem?
a) Deep Belief Networks (DBNs)
b) Long Short-Term Memory (LSTM) networks
c) Perceptrons
d) Radial Basis Function Networks
Answer: b) Long Short-Term Memory (LSTM) networks
116. In LSTM networks, what is the role of the forget gate?
a) To forget outdated information from previous steps
b) To store data in long-term memory
c) To normalize inputs
d) To perform convolution operations
Answer: a) To forget outdated information from previous steps
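Question 116's forget gate is one of three sigmoid gates inside an LSTM cell. A single time-step sketch in numpy; the sizes and random weights are illustrative, and real implementations fuse the four matrix multiplies into one:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

n_in, n_hid = 3, 4
x = rng.normal(size=n_in)      # current input
h = np.zeros(n_hid)            # previous hidden state
c = np.zeros(n_hid)            # previous cell state

def gate_params():             # one weight matrix per gate over [h, x]
    return rng.normal(scale=0.1, size=(n_hid, n_hid + n_in)), np.zeros(n_hid)

(Wf, bf), (Wi, bi), (Wo, bo), (Wc, bc) = (gate_params() for _ in range(4))

hx = np.concatenate([h, x])
f = sigmoid(Wf @ hx + bf)      # forget gate: how much old memory to keep
i = sigmoid(Wi @ hx + bi)      # input gate: how much new candidate to write
o = sigmoid(Wo @ hx + bo)      # output gate: how much cell state to expose
c_tilde = np.tanh(Wc @ hx + bc)

c = f * c + i * c_tilde        # the forget gate scales the old cell state here
h = o * np.tanh(c)             # new hidden state

print("forget gate:", f.round(2))
```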
117. A Gated Recurrent Unit (GRU) is different from LSTMs because:
a) GRUs have separate memory cells
b) GRUs use fewer parameters and are computationally efficient
c) GRUs require more training data
d) GRUs cannot process sequential data
Answer: b) GRUs use fewer parameters and are computationally efficient
Self-Organizing Maps (SOMs) and Autoencoders
118. A Self-Organizing Map (SOM) is primarily used for:
a) Supervised learning tasks
b) Unsupervised learning and clustering
c) Reinforcement learning
d) Data augmentation
Answer: b) Unsupervised learning and clustering
119. Autoencoders are used for:
a) Feature extraction and dimensionality reduction
b) Time series prediction
c) Reinforcement learning
d) Clustering
Answer: a) Feature extraction and dimensionality reduction
120. In autoencoders, the bottleneck layer is responsible for:
a) Increasing the number of neurons
b) Compressing the data representation
c) Performing backpropagation
d) Adding extra features
Answer: b) Compressing the data representation
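Questions 119-120: an autoencoder's bottleneck is simply a hidden layer narrower than the input, which forces a compressed representation. A shape-level sketch with random, untrained weights, just to make the dimensions concrete (the 64 -> 8 -> 64 sizes are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda z: np.maximum(0.0, z)

x = rng.normal(size=(1, 64))                   # input: 64 features

W_enc = rng.normal(scale=0.1, size=(64, 8))    # encoder: 64 -> 8
W_dec = rng.normal(scale=0.1, size=(8, 64))    # decoder: 8 -> 64

code = relu(x @ W_enc)                         # bottleneck: 8-value representation
x_hat = code @ W_dec                           # reconstruction back to 64 values

print(x.shape, "->", code.shape, "->", x_hat.shape)
# Training would adjust W_enc and W_dec to minimize ||x - x_hat||^2.
```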
Generative and Deep Learning Models
121. What is the primary function of Generative Adversarial Networks (GANs)?
a) Classification of images
b) Generating new data that resembles training data
c) Performing regression tasks
d) Reducing computational costs
Answer: b) Generating new data that resembles training data
122. A GAN consists of which two neural networks?
a) Classifier and Regressor
b) Generator and Discriminator
c) Encoder and Decoder
d) CNN and RNN
Answer: b) Generator and Discriminator
123. Which neural network model is commonly used in text generation?
a) CNN
b) RNN
c) Transformer
d) Autoencoder
Answer: c) Transformer
Advanced and Specialized Neural Network Models
124. Transformers are widely used in:
a) Image classification
b) Sequence processing and NLP tasks
c) Clustering
d) Feature extraction
Answer: b) Sequence processing and NLP tasks
125. The attention mechanism in transformers helps by:
a) Allowing the model to focus on relevant parts of input data
b) Improving activation functions
c) Reducing training time
d) Making the network shallow
Answer: a) Allowing the model to focus on relevant parts of input data
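Question 125's attention mechanism reduces to one small formula, softmax(QKᵀ/√d)·V: every query scores every key, and the normalized scores weight the values. A minimal numpy sketch; the sequence length, dimension, and random contents are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                     # query-key similarities
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                # softmax over keys
    return w @ V, w

seq_len, d = 4, 8
Q, K, V = (rng.normal(size=(seq_len, d)) for _ in range(3))

out, w = attention(Q, K, V)
print(w.round(2))   # each row sums to 1: how strongly a position attends to the others
```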
126. A Deep Belief Network (DBN) is different from standard neural networks because:
a) It is only used for classification tasks
b) It uses multiple layers of unsupervised learning
c) It has no hidden layers
d) It cannot be trained
Answer: b) It uses multiple layers of unsupervised learning
127. Which neural network is commonly used in reinforcement learning?
a) Deep Q-Networks (DQN)
b) CNN
c) LSTM
d) Autoencoder
Answer: a) Deep Q-Networks (DQN)
128. Which deep learning model is commonly used in robotics and autonomous systems?
a) CNN
b) Reinforcement Learning Models
c) GANs
d) Autoencoders
Answer: b) Reinforcement Learning Models