The growing popularity of short-form video platforms and their reliance on algorithmic recommendations highlight the risk of viewers unintentionally encountering distressing content. We therefore investigated viewers' experiences with distressing content and developed design approaches to alleviate their discomfort. Through in-depth interviews, we found that participants perceived and reacted differently to "socially inappropriate content," which violated societal norms, and "personally discomforting content," which triggered negative reactions on a personal level. Participants also expressed frustration with the lack of transparency in content reporting processes, the difficulty of tailoring recommendation algorithms to avoid distressing content, and the limitations of post-exposure feedback mechanisms. To address these challenges, we conceptualized three design approaches: enhancing the transparency of the reporting process, giving users granular control over content recommendations, and enabling preemptive adjustments to their content feeds. Our findings and proposed design approaches may offer valuable directions for improving viewer well-being on short-form video platforms.