Keith Grainge - Array design
Why an interferometer?
Higher angular resolution
Better control of systematic errors
Lower cost: a large number of small antennas can provide the same collecting area as a single large antenna
Telescope objectives
Aim for telescope
Single specific measurement
General purpose
Expected lifetime
Does an existing facility already deliver this capability? If so, why not use it?
Is an interferometer the right instrument for this objective?
Requirements
Derive technical requirements from the telescope objectives
Many will be dependent on each other
Multiple solutions can achieve the same objective
Similar-sounding objectives can give different performance requirements
Optimisation will be needed to determine design choices
Cost is likely to be the deciding factor
Design choice areas
Angular scale of interest
Maximum and minimum scale sizes measured; larger objects are resolved out
Weighting of visibilities in the aperture plane determines the point spread function, so angular-scale sensitivity can be tailored
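A minimal Python sketch (my own illustration, not from the talk; the array numbers are hypothetical) of the rule-of-thumb scale limits set by the extreme baselines:

```python
import math

def angular_scales(freq_hz, b_min_m, b_max_m):
    """Rule-of-thumb angular-scale limits of an interferometer:
    resolution ~ lambda / B_max; largest well-imaged scale ~ lambda / B_min."""
    lam = 3.0e8 / freq_hz                     # observing wavelength [m]
    rad2arcsec = 180.0 / math.pi * 3600.0
    theta_res = lam / b_max_m * rad2arcsec    # finest resolvable scale [arcsec]
    theta_las = lam / b_min_m * rad2arcsec    # largest angular scale [arcsec]
    return theta_res, theta_las

# hypothetical 1.4 GHz array with 30 m to 3 km baselines
print(angular_scales(1.4e9, 30.0, 3000.0))   # ~ (14.7", 1470")
```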
Observing frequency
Driver for spectral line measurements
Continuum observation driven by source spectral index
Choice of bandwidth
Wider bandwidth improves continuum sensitivity and allows measurement of spectral index
Wider bandwidth allows more lines to be measured
Choice of channel bandwidth
Mostly unimportant for continuum
Ensure sufficient spectral resolution for line observations
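A one-line sketch (my illustration; the HI example numbers are mine) of what "sufficient resolution" means in velocity terms:

```python
def velocity_resolution_kms(chan_bw_hz, freq_hz):
    """Velocity width of a channel: dv = c * dnu / nu (non-relativistic)."""
    return 2.998e5 * chan_bw_hz / freq_hz     # c in km/s

# e.g. 10 kHz channels on the 1420.4 MHz HI line -> ~2.1 km/s resolution
print(velocity_resolution_kms(10e3, 1.4204e9))
```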
Sensitivity
Blockage by secondary, supports etc.
Illumination efficiency
Surface errors: Ruze formula (see the sketch after this list)
Higher frequency observations allow higher bandwidths
Combining more than 30% fractional bandwidth into one image becomes problematic
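The Ruze formula quantifies the surface-error loss mentioned above; a minimal sketch (my own, with an assumed 0.3 mm RMS surface):

```python
import math

def ruze_efficiency(surface_rms_m, freq_hz):
    """Aperture efficiency loss from random surface errors (Ruze formula):
    eta = exp(-(4*pi*eps/lambda)^2), where eps is the surface RMS error."""
    lam = 3.0e8 / freq_hz
    return math.exp(-(4.0 * math.pi * surface_rms_m / lam) ** 2)

# a 0.3 mm RMS surface is fine at 10 GHz (~0.98) but poor at 100 GHz (~0.21)
print(ruze_efficiency(0.3e-3, 10e9), ruze_efficiency(0.3e-3, 100e9))
```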
Telescope location
Height above sea level important for precipitable water vapour content - ideally above the inversion layer
Radio-frequency interference from human habitation (a ground shield or surrounding mountains can help)
Ease of access to site
Infrastructure to run telescope
Local (expert) support for operations
Which regions of sky are accessible
Tropospheric stability
Ionospheric stability
Fraction of days when rain/wind/snow make observing impossible
The relative importance of the above issues depends critically on frequency
Receptor type
Dish antenna
Issues
Many different choices of optics
offset/on axis; effects of blockage
prime/secondary focus (or tertiary)
Cassegrain/Gregorian/Dragone
Mounting arrangement, e.g. Alt/Az
elevation range
Pointing accuracy
Note that source positions on map depend upon visibility phase not antenna pointing
Antenna slew and settle rate
Reduces time lost driving to the next field
Calibration cadence can impose requirements
Alternatively, use a transit instrument
Antenna size
Larger area gives higher sensitivity
Cost rises steeply with size (roughly as d^3; see the sketch after this list)
Larger dishes require higher pointing accuracy
Surface accuracy - Ruze formula
At low frequency can use mesh rather than solid dishes
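A toy comparison (my own sketch; the d^3 per-dish cost scaling is the rule of thumb quoted above, and the areas are made up) of reaching a fixed collecting area with small versus large dishes:

```python
import math

def array_cost(total_area_m2, dish_diam_m, cost_exponent=3.0):
    """Relative cost of a fixed total collecting area built from dishes of
    diameter d, assuming per-dish cost ~ d**3 (arbitrary cost units)."""
    n_dish = total_area_m2 / (math.pi * (dish_diam_m / 2.0) ** 2)
    return n_dish * dish_diam_m ** cost_exponent

# same 10^4 m^2 from 15 m vs 30 m dishes: the smaller dishes cost less,
# but need more receivers and a bigger correlator
print(array_cost(1e4, 15.0), array_cost(1e4, 30.0))
```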
Aperture array
Station of several individual elements
Electronics apply complex weights and sum the signals
Form multiple beams on sky
Equivalent to a dish bringing radiation to a focus
Particularly good at low frequency
Feed horn
Very well controlled beam, low sidelobes
Expensive to give large collecting area
Cylinders
Metal reflector forms the beam in one dimension
Electronics form beams in the orthogonal direction
Radio frequency interference
Astronomical radio signals are weak even for the strongest sources of interest
Many sources of man-made RFI
Satellite down-links
Airport/military radar
Mobile phones
Television and radio transmitters
Sparking from power transmission lines
RFI signals can be time-variable, and both narrowband and relatively wideband
Waterfall plots (frequency vs. time) are used to identify and flag RFI (see the sketch after this list)
Interferometers help with RFI rejection
Only correlated RFI is detected
Terrestrial (non-fringe-rotating) signals map to the north or south celestial pole
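A toy waterfall flagger (my sketch; real flaggers use far more sophisticated iterative and morphological algorithms) showing the basic idea of thresholding a time-frequency array:

```python
import numpy as np

def flag_rfi(waterfall, nsigma=5.0):
    """Flag samples in a time x frequency 'waterfall' array that sit more
    than nsigma robust-sigma above the median."""
    med = np.median(waterfall)
    mad = np.median(np.abs(waterfall - med))
    sigma = 1.4826 * mad                       # MAD -> Gaussian-equivalent sigma
    return waterfall > med + nsigma * sigma    # boolean flag mask

data = np.random.randn(100, 512)               # noise-only waterfall
data[:, 200] += 50.0                           # a persistent narrowband carrier
print(flag_rfi(data)[:, 200].all())            # -> True: channel is flagged
```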
Self induced RFI
Sub-systems on the telescope itself can cause RFI
Any digital electronics (digitisers, correlator, computers)
Drives and encoders
Power supplies
Screen individual components
Locate computers and correlator inside a screened room (Faraday cage)
Use optical fibre rather than coax
Configuration of the telescope
Baseline length determines which scales are measured
The layout (configuration) of the antennas in an interferometer determines the sampling of the uv plane
To maximise filling of the uv plane, ensure no redundant baselines (any structure in the configuration will be reflected in the PSF)
Scattering antennas randomly over a wide observatory area carries a large overhead in power, data-transport, road and other infrastructure
Reconfigurable arrays allow great flexibility but bring significant infrastructure overhead and loss of observing time
Logarithmically spacing antennas along spirals from a central core gives a good compromise (see the sketch after this list)
Redundant baselines do offer additional possibilities for calibration
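A toy layout generator (my own sketch; the arm count, spacing and growth rate are invented, not any real array) for the log-spiral compromise, plus the snapshot baselines it yields:

```python
import numpy as np

def log_spiral_array(n_per_arm=10, n_arms=3, r0=50.0, growth=0.35):
    """Antennas spaced logarithmically along spiral arms from a central core."""
    positions = []
    for arm in range(n_arms):
        phi0 = 2.0 * np.pi * arm / n_arms
        for k in range(n_per_arm):
            r = r0 * np.exp(growth * k)        # logarithmic radial spacing
            phi = phi0 + growth * k            # wind along the arm
            positions.append((r * np.cos(phi), r * np.sin(phi)))
    return np.array(positions)

ants = log_spiral_array()
# every antenna pair is one baseline -> snapshot uv coverage
i, j = np.triu_indices(len(ants), k=1)
baselines = ants[i] - ants[j]
print(len(baselines), "baselines; longest =", np.hypot(*baselines.T).max(), "m")
```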
System temperature
Dominated by the first amplifier in the receiver chain and by losses before this amplifier (see the sketch after this list)
These components are generally cryogenically cooled
Spillover due to sidelobes of the telescope beam hitting warm ground
Depends upon illumination, blockage and zenith angle
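How T_sys feeds into sensitivity, via the standard radiometer equation; a minimal sketch (my own, single polarisation, illustrative array parameters):

```python
import math

k_B = 1.380649e-23                             # Boltzmann constant [J/K]

def point_source_rms_jy(t_sys_k, a_eff_m2, n_ant, bw_hz, t_int_s):
    """Thermal noise on an image (single polarisation, efficiency factors
    ignored): sigma = SEFD / sqrt(N(N-1) * bw * t), SEFD = 2 k T_sys / A_eff."""
    sefd_jy = 2.0 * k_B * t_sys_k / a_eff_m2 / 1e-26   # per-antenna SEFD [Jy]
    return sefd_jy / math.sqrt(n_ant * (n_ant - 1) * bw_hz * t_int_s)

# e.g. 30 antennas, T_sys = 25 K, A_eff = 300 m^2, 100 MHz bandwidth, 1 hour
print(point_source_rms_jy(25.0, 300.0, 30, 100e6, 3600.0) * 1e6, "uJy")
```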
Survey speed
Not relevant if aim is to observe individual, discrete objects
Large-area, high-frequency surveys are difficult
Going to greater depth quickly becomes prohibitive
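A rough sketch (my own; all parameters are illustrative) of why: pointings scale with area/FoV, and time per pointing scales with the square of the target depth:

```python
def survey_time_hours(area_deg2, target_rms_jy, sefd_jy, n_ant, bw_hz, fov_deg2):
    """Rough total survey time: (area / FoV) pointings, each integrated until
    the radiometer equation (single polarisation) reaches the target rms."""
    t_point_s = (sefd_jy / target_rms_jy) ** 2 / (n_ant * (n_ant - 1) * bw_hz)
    return (area_deg2 / fov_deg2) * t_point_s / 3600.0

# going twice as deep quadruples the time; at high frequency the FoV shrinks
# as freq^-2, so large-area, deep, high-frequency surveys explode in cost
print(survey_time_hours(1000.0, 1e-5, 400.0, 30, 100e6, 1.0))
```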
Surface brightness sensitivity
If interested in imaging a large angular-scale feature, need sensitivity on short baselines (see the sketch after this list)
Need compact array of telescopes
Shadowing (looking into the back of another telescope at some angles) becomes an issue
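The trade-off in numbers: converting image rms to brightness temperature shows why a compact array (large synthesised beam) is needed for extended emission. A sketch of the standard conversion T_B = S * lambda^2 / (2 k Omega_beam) (my own; example values are illustrative):

```python
import math

def brightness_temp_rms_k(rms_jy_per_beam, freq_hz, beam_fwhm_arcsec):
    """Brightness-temperature sensitivity for a Gaussian synthesised beam:
    T_B = S * lambda^2 / (2 k Omega_beam)."""
    lam = 3.0e8 / freq_hz
    fwhm_rad = beam_fwhm_arcsec * math.pi / (180.0 * 3600.0)
    omega = math.pi * fwhm_rad ** 2 / (4.0 * math.log(2.0))  # beam solid angle [sr]
    return rms_jy_per_beam * 1e-26 * lam ** 2 / (2.0 * 1.380649e-23 * omega)

# the same 10 uJy/beam rms at 1.4 GHz: a 1" beam gives ~100x worse T_B
# sensitivity than a 10" beam
print(brightness_temp_rms_k(1e-5, 1.4e9, 1.0), brightness_temp_rms_k(1e-5, 1.4e9, 10.0))
```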
Polarisation
Receivers can be intrinsically sensitive to either linear or circular polarisation
Measuring both orthogonal polarisations gives two independent measurements of Stokes I
Requires a receiver chain for both polarisations and four correlators for each baseline
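A sketch of the standard linear-feed combinations behind those four correlations (my illustration; calibration and leakage terms are ignored, and sign conventions vary):

```python
def stokes_from_linear(v_xx, v_yy, v_xy, v_yx):
    """Stokes visibilities from the four linear-feed correlation products."""
    I = 0.5 * (v_xx + v_yy)
    Q = 0.5 * (v_xx - v_yy)
    U = 0.5 * (v_xy + v_yx)
    V = -0.5j * (v_xy - v_yx)
    return I, Q, U, V

# an unpolarised 1 Jy source: XX = YY = 1, cross-hands = 0
print(stokes_from_linear(1.0, 1.0, 0.0, 0.0))
```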
Correlator
Normally digital these days, but analogue is possible
Nyquist-sample the time-stream data and digitise the signal with n bits of resolution
Split into frequency channels
Apply delays, correlate and integrate over time
Size of correlator determined by number of baselines and number of frequency channels (a non-linear dependence; see the sketch after this list)
Different correlator modes (e.g. frequency zoom) increase flexibility and complexity
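A back-of-envelope sketch (my own; order-unity constants and overheads are ignored) of the non-linear scaling for an FX correlator:

```python
import math

def fx_correlator_ops(n_ant, bw_hz, n_chan, n_pol=2):
    """Rough operations-per-second of an FX correlator: F stage does one
    FFT per antenna/polarisation; X stage does one complex multiply-
    accumulate per baseline, channel and polarisation product."""
    n_baselines = n_ant * (n_ant + 1) // 2     # including autocorrelations
    f_ops = n_ant * n_pol * bw_hz * math.log2(n_chan)   # FFT ~ B log2(Nchan)
    x_ops = n_baselines * n_pol ** 2 * bw_hz            # CMAC rate ~ B per product
    return f_ops, x_ops

# the X stage grows as N^2: doubling the array ~quadruples the correlator
print(fx_correlator_ops(64, 1e9, 4096))
print(fx_correlator_ops(128, 1e9, 4096))
```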
Compute requirements
For mapping, need to first grid all the visibilities
If we want to map the entire FoV, the number of pixels in the image follows from the FoV and the resolution
Rarely do this in VLBI
Choose image size in pixels to be 2^n to allow use of the FFT
To avoid time-averaging smearing, need correlator integration time t_int < (d / Dmax) / omega_E, where omega_E is the Earth's rotation rate
To avoid frequency (bandwidth) smearing, need channel bandwidth dnu < nu * (d / Dmax)
Wide-field mapping requires w-correction (see Anna's talk), adding further computational difficulty
Net effect of all of the above is that a continuum survey mapping experiment with a large Nant, small d and large Dmax can quickly become a Big Data challenge
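A back-of-envelope sketch pulling the above together (my own; it assumes ~3 pixels per synthesised beam, the smearing limits quoted above, the ~30% usable fractional bandwidth, and invented array parameters):

```python
import numpy as np

def imaging_budget(freq_hz, dish_diam_m, b_max_m, n_ant, n_pol=2):
    """Rough numbers behind the 'Big Data' claim for a continuum survey."""
    lam = 3.0e8 / freq_hz
    fov = lam / dish_diam_m                    # primary beam ~ FoV [rad]
    res = lam / b_max_m                        # synthesised beam [rad]
    n_pix = 2 ** int(np.ceil(np.log2(3.0 * fov / res)))  # 2^n image for FFT
    omega_e = 7.29e-5                          # Earth rotation rate [rad/s]
    t_max = (dish_diam_m / b_max_m) / omega_e  # time-smearing limit [s]
    dnu_max = freq_hz * dish_diam_m / b_max_m  # bandwidth-smearing limit [Hz]
    n_chan = 2 ** int(np.ceil(np.log2(0.3 * freq_hz / dnu_max)))  # ~30% band
    vis_rate = n_ant * (n_ant - 1) / 2 * n_pol ** 2 * n_chan / t_max
    return n_pix, t_max, n_chan, vis_rate      # visibilities per second

# e.g. 512 x 15 m dishes, 100 km max baseline, 1.4 GHz (illustrative):
# tens of thousands of pixels across, ~2 s dumps, thousands of channels,
# and hundreds of millions of visibilities per second
print(imaging_budget(1.4e9, 15.0, 100e3, 512))
```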