
Asimov's Three Laws of Robotics
Asimov's Three Laws of Robotics are guidelines for robot behavior. First, a robot may not harm a human being or, through inaction, allow a human being to come to harm. Second, a robot must obey human orders, except where those orders would conflict with the First Law. Third, a robot must protect its own existence, as long as such protection does not conflict with the First or Second Law. These rules aim to ensure robots are safe, helpful, and ethically aligned with human interests.
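
The structural point is that the laws form a strict precedence hierarchy: the First Law overrides the Second, and both override the Third. As a purely illustrative sketch (the function name and the boolean inputs are hypothetical, not drawn from any real robotics system), that ordering can be expressed as a simple priority check:

```python
def permitted(action_harms_human: bool,
              inaction_harms_human: bool,
              ordered_by_human: bool,
              endangers_robot: bool) -> bool:
    """Illustrative priority check for Asimov's Three Laws.

    All inputs are hypothetical boolean judgments about a candidate
    action; deriving them in practice is the hard part that the
    stories themselves explore.
    """
    # First Law: never act so as to harm a human, and never stay
    # idle when acting would prevent harm to a human.
    if action_harms_human:
        return False
    if inaction_harms_human:
        return True  # The First Law compels action regardless of the laws below.

    # Second Law: obey human orders, unless doing so would have
    # violated the First Law (already ruled out above).
    if ordered_by_human:
        return True

    # Third Law: preserve the robot's own existence, but only when the
    # First and Second Laws impose no stronger requirement.
    if endangers_robot:
        return False

    return True
```

The order of the checks is the whole point: each lower-priority consideration is evaluated only after every higher-priority law has had its say.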