This study investigated issues in an exercise physiology website's service quality and content using experts' and users' satisfaction indicators, and compared experts' and users' satisfaction before and after the website was revised. The satisfaction indicator frameworks in our study were adapted from Lin (2002) and covered three aspects: contents (i.e., suitability, accuracy, credibility, attraction, currency, and academic topics), programming (i.e., layout arrangement and visual appeal, interface design, quality of links, style of interaction, and assisted learning tools), and special functions (i.e., member functions, the exercise physiology e-paper, online gathering activities, and information downloading). Seven exercise physiology experts and three network experts participated in the evaluation of the website. A user-satisfaction questionnaire was administered online to the 2,401 members of the exercise physiology website. Based on the scores and the experts' suggestions from the pre-evaluation, we revised the website and its content. After the revision, we applied the same experts' and users' satisfaction indicators in a post-evaluation. The pre- and post-evaluation scores from both experts and users were then compared statistically, allowing us to measure the change in satisfaction after the website revision. For contents, both the experts' and the users' scores improved significantly from pre- to post-evaluation, and the experts' scores were significantly higher than the users'. For programming and special functions, both the experts' and the users' scores also improved significantly, with no significant difference between experts and users.
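The abstract does not name the statistical test used for the pre/post comparison; a paired t-test on matched pre- and post-evaluation scores is one plausible choice. The sketch below is purely illustrative, with hypothetical scores (not data from the study), and implements the paired t statistic directly so it is self-contained.

```python
import math

def paired_t(pre, post):
    """Paired t statistic for matched pre/post scores (df = n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the differences (Bessel's correction)
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical expert content scores (1-5 scale) before and after revision
pre = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2, 3.3]
post = [4.2, 4.0, 4.4, 3.9, 4.1, 4.3, 4.0]
t = paired_t(pre, post)  # compare against the critical t for df = 6
```

A positive t exceeding the two-tailed critical value (about 2.447 at alpha = 0.05 with df = 6) would indicate a significant improvement, which is the pattern the study reports.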
Both the experts' and the users' satisfaction evaluations showed significant improvement in contents, programming, and special functions; that is, the website revision was effective. Since experts and users made different suggestions in the website evaluation, adopting only one group's opinions would introduce bias. It would therefore be better to combine both experts' and users' suggestions.