robots.txt Disallow does not guarantee that a page will stay out of search results: Google may still decide, based on external signals such as incoming links, that the URL is relevant and list it anyway. If you want to explicitly block a page from being indexed, use the noindex robots meta tag or the X-Robots-Tag HTTP header instead — and make sure the page is not also disallowed in robots.txt, because Google has to be able to crawl the page to see the noindex at all.
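If you go the header route, the noindex is sent with the HTTP response rather than in the page markup, which also works for non-HTML files like PDFs. A minimal sketch, assuming an nginx config (`/private/` is just a placeholder path; on Apache with mod_headers the equivalent is `Header set X-Robots-Tag "noindex"`):

```
# nginx: attach X-Robots-Tag to every response served from this location
location /private/ {
    add_header X-Robots-Tag "noindex";
}
```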
Seems like you should be able to drop <meta name="robots" content="noindex"> into the per-page Head tag section with no problem, as noted in the UI: “code included here will only apply to this page”. See the snippet below for exactly what to paste.
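For reference, lowercase is the usual form, and this one line is all the page needs inside <head>, assuming the head-code field injects whatever you paste verbatim into the page’s <head>:

```html
<!-- Ask crawlers not to index this page (the page must remain crawlable for this to be seen) -->
<meta name="robots" content="noindex">
```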