In the current study, two experiments were conducted to investigate people's moral judgments about human and robot agents in personal and impersonal dilemmas. The results showed that: (1) in the impersonal dilemma (the autonomous vehicle dilemma), people applied the same moral norms to human and robot agents. They held the same expectation about which action the agents should take (the utilitarian one) and gave the same moral evaluations (of permissibility, rightness, and blame) of the agents' actual actions. (2) In the personal dilemma (the footbridge dilemma), people applied different moral norms to human and robot agents. Although overall people wanted both human and robot agents to choose the deontological action, robot agents were more expected than humans to take the utilitarian action, and when they took that action they received more favorable evaluations (higher permissibility and lower blame) than their human counterparts.