We consider a Neyman-Pearson (NP) distributed binary detection problem in a bandwidth-constrained wireless sensor network, where the fusion center (FC) makes a final decision about the presence or absence of a known signal in correlated Gaussian noises. Our goals are (i) to investigate whether or not randomized transmission can improve detection performance under a communication rate constraint, and (ii) to explore how the correlation among observation noises impacts performance. We propose two novel schemes that combine the concepts of censoring and randomized transmission (which we name CRT schemes) and compare them with the pure censoring scheme. In CRT (pure censoring) schemes we map randomly (deterministically) a sensor's observation to a ternary transmit symbol u(k) ∈ {-1, 0, 1}, where "0" corresponds to no transmission (the sensor censors). We model the randomization in CRT schemes using two independent Bernoulli random variables with parameters g, f. Assuming sensors transmit over orthogonal fading channels, we formulate and address two system-level constrained optimization problems: in the first problem we minimize the probability of miss detection at the FC, subject to constraints on the probabilities of transmission and false alarm at the FC; in the second (dual) problem we minimize the probability of transmission, subject to constraints on the probabilities of miss detection and false alarm at the FC. The optimization variables include g, f. Both problems are non-convex and their solutions can be found via exhaustive search. Seeking to shed some light on the qualitative behavior of randomization, we propose methods to find sub-optimal solutions of the original problems and study the relation between these solutions. Through analysis and numerical evaluations, we explore and provide the conditions under which CRT schemes outperform the pure censoring scheme.
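To make the ternary mapping concrete, the sketch below illustrates one plausible reading of the CRT idea: an observation in a middle (censoring) region is never transmitted, while observations in the two outer decision regions are transmitted as +1 or -1 only with probabilities g and f, respectively, realizing the two independent Bernoulli randomizations. The thresholds `t_low`, `t_high` and the exact placement of the Bernoulli coins are illustrative assumptions, not the paper's precise formulation.

```python
import random


def crt_map(x, t_low, t_high, g, f):
    """Map a sensor observation x to a ternary transmit symbol u in {-1, 0, 1}.

    Hypothetical CRT-style rule (illustrative assumption):
      - x >= t_high: transmit +1 with probability g, otherwise censor (u = 0)
      - x <= t_low:  transmit -1 with probability f, otherwise censor (u = 0)
      - otherwise:   censor (u = 0, no transmission)

    Setting g = f = 1 recovers a deterministic (pure censoring) mapping.
    """
    if x >= t_high:
        return 1 if random.random() < g else 0
    if x <= t_low:
        return -1 if random.random() < f else 0
    return 0  # censoring region: sensor stays silent
```

With g = f = 1 the mapping is deterministic, matching the pure censoring scheme; lowering g or f trades detection information for a reduced transmission probability, which is the tension the two constrained optimization problems formalize.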