As computers become smarter, stronger, and more woven into the fabric of the world, it is increasingly important to ensure they do not autonomously cause harm; better still, they should actively support health and welfare. But how can technology know what is good for others, especially when people themselves so often get it wrong? One way is for computers to gauge emotional responses to their actions. Another is for machines to have genuine feelings akin to our own. Together, these faculties could provide a foundation for exploring artificial empathy. Toward this end, I build robots of various shapes that use haptic technology, including sensors and actuators, to physically engage people. Touch provides the basic mechanism for working out the relationships among morphology, actions, and their physical and emotional consequences, including the recipient’s reactions. For example, if a machine can recognize what constitutes a blow, and when one has been given or received, then perhaps it can learn when, if ever, one is appropriate. With such robots, various programs can be explored to teach the Golden Rule: “Do unto others as you would have them do unto you.” In this talk, I will present some of my robots and discuss their repeated modular design, which tightly couples sensing with actuation, and show how they differ from related work.