
Discovery Harbour Enters Contract with Aeroquest for the Surveying of the 2BAR Project Area in Nevada

VANCOUVER, BRITISH COLUMBIA -- (Marketwired) -- 07/24/14 -- Discovery Harbour Resources Corp ("DHR") (TSX VENTURE: DHR) is pleased to announce that it continues to advance its 2BAR project in Nevada by contracting Aeroquest Airborne, a Geotech company, to survey the project area.

The contracted survey is a high-resolution, 3-axis magnetic gradiometer survey over an area of approximately 11 miles (17.7 kilometers) by 5 miles (8.1 kilometers). Line spacing is planned at 200 meters, with perpendicular tie lines every 250 meters, producing a reasonably tight grid cell arrangement. Aeroquest projects the survey to begin on August 1, 2014 and, barring mechanical or weather-related issues, to take approximately four days. DHR will have access to newly acquired data on a daily basis, which will expedite the interpretation and target selection processes.
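
As a rough cross-check on the quoted timeline, the planned geometry implies a total coverage on the order of 1,300 line-kilometers. The sketch below works through that arithmetic; it assumes traverse lines run parallel to the block's long axis and are flown edge to edge, neither of which is stated in the release.

```python
# Rough line-km estimate for the planned 2BAR survey geometry.
# Assumptions (not stated in the release): traverse lines run along
# the 17.7 km axis; tie lines run along the 8.1 km axis; lines are
# flown from edge to edge of the block.

LENGTH_KM = 17.7          # long axis of the survey block (11 miles)
WIDTH_KM = 8.1            # short axis of the survey block (5 miles)
LINE_SPACING_KM = 0.200   # traverse line spacing (200 m)
TIE_SPACING_KM = 0.250    # tie line spacing (250 m)

# Number of lines needed to cover the block at each spacing
# (+1 to include both edges of the block).
n_traverse = int(WIDTH_KM / LINE_SPACING_KM) + 1
n_tie = int(LENGTH_KM / TIE_SPACING_KM) + 1

traverse_km = n_traverse * LENGTH_KM
tie_km = n_tie * WIDTH_KM
total_km = traverse_km + tie_km

print(f"traverse lines: {n_traverse} x {LENGTH_KM} km = {traverse_km:.0f} km")
print(f"tie lines:      {n_tie} x {WIDTH_KM} km = {tie_km:.0f} km")
print(f"total:          ~{total_km:.0f} line-km "
      f"(~{total_km / 4:.0f} km per day over 4 flying days)")
```

At roughly 325 line-km per flying day under these assumptions, the geometry is broadly consistent with the four-day projection.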

Observations made during the due diligence field trip to the project area, conducted at the end of June, indicated that the high-grade copper-silver mineralization sampled in outcrop and workings is hosted in two lithologies: fault breccias (tectonic breccias) and amygdaloidal flow tops, both occurring in the Jurassic andesitic (intermediate volcanic) units. The amygdaloidal flow units are vertically stacked lithologies (a layer-cake configuration). The tectonic breccia units were noted to be developed at the intersections of high-angle normal and reverse block faults where they encounter other block fault contacts, shear zones and intrusive (diorite-gabbro) contacts.

A strong contrast in magnetic susceptibilities exists among the rock units exposed in the survey area, and this contrast will enable the geology of the region to be mapped from the magnetic datasets; no detailed geological or structural mapping has yet been undertaken for this area. Strongly magnetic mafic intrusive rocks of the Humboldt Complex contrast with weakly to non-magnetic Tertiary felsic volcanics, while the targeted Jurassic andesitic units are moderately magnetic. The gradiometer system will enhance these contrasts and make them discernible in plan view. Additionally, the gradiometer system will produce a dataset allowing deeply buried lithologies to be detected and mapped, which also has implications for ore fluid genesis.
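
As a purely illustrative sketch of how such contrasts support lithological mapping, the snippet below bins gridded magnetic values into the three broad unit classes described above. The threshold values are hypothetical placeholders, not values from this survey; real interpretation would use Aeroquest's processed gradiometer grids together with geological control.

```python
import numpy as np

# Toy illustration: bin a gridded magnetic dataset into the three broad
# unit classes described in the release. The residual-field thresholds
# (in nT) are hypothetical placeholders, not survey values.
UNIT_CLASSES = [
    ("Tertiary felsic volcanics (weak/non-magnetic)", -np.inf, 50.0),
    ("Jurassic andesites (moderately magnetic)",       50.0,   300.0),
    ("Humboldt Complex mafic intrusives (strong)",     300.0,  np.inf),
]

def classify_magnetics(grid_nt: np.ndarray) -> np.ndarray:
    """Return an integer class index per grid cell of residual field (nT)."""
    classes = np.zeros(grid_nt.shape, dtype=int)
    for idx, (_, lo, hi) in enumerate(UNIT_CLASSES):
        classes[(grid_nt >= lo) & (grid_nt < hi)] = idx
    return classes

# Example with synthetic residual-field values (nT):
demo = np.array([[10.0, 120.0],
                 [450.0, 80.0]])
classes = classify_magnetics(demo)
for idx, (name, _, _) in enumerate(UNIT_CLASSES):
    print(f"{name}: {(classes == idx).sum()} cells")
```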

The structural complexity of the area was evident from the field visit. Pronounced structural lineaments associated with the block faulting, drastic changes in the attitudes (strikes and dips) of the Jurassic sequences, lateral offsets and tilting, and dramatic outcrops of vertical and sub-vertical shear zones in the volcanics all appear to be related to, and to have been the conduits for, the ascending fluids that produced the mineralization. When combined with the known copper occurrences, the new occurrences found in the course of the diligence review and staking, and all data presently in hand, the structural information produced from the gradiometer dataset will serve to more accurately locate targets for the next phase: drilling. The drilling program will be announced after the survey results have been evaluated.

Michael J. Senn, a licensed professional geologist, is the Qualified Person for Discovery Harbour Resources as described in National Instrument 43-101 and has reviewed and approved the technical contents of this release.

ON BEHALF OF THE BOARD OF DISCOVERY HARBOUR RESOURCES CORP.

Frank D. Hegner, President, CEO, and Director

Disclaimer for Forward-Looking Information

Certain information regarding the Company contained in this press release may constitute forward-looking statements within the meaning of applicable securities laws. Forward-looking statements may include estimates, plans, opinions, forecasts, projections or other statements that are not statements of fact. Although the Company believes that expectations reflected in such forward-looking statements are reasonable, it can give no assurance that such expectations will prove to have been correct. The Company cautions that actual performance will be affected by a number of factors, many of which are beyond the Company's control, and that future events and results may vary substantially from what the Company currently foresees.

Neither TSX Venture Exchange nor its Regulation Services Provider (as that term is defined in the policies of the TSX Venture Exchange) accepts responsibility for the adequacy or accuracy of this release.

Contacts:
Discovery Harbour Resources Corp
Kieran Magee
(604) 689-1799
(604) 689-8199 (FAX)
