
Excalibur Resources Ltd.: Cariboo Property Update

TORONTO, ONTARIO -- (Marketwire) -- 02/08/13 -- Excalibur Resources Ltd. ("Excalibur" or the "Company") (CNSX:XBR)(FRANKFURT:X9CN) is pleased to provide an update on its 2012 work program at its Cariboo, Princess and Cleopatra properties in south-eastern B.C., approximately 7 kilometres south of Nelson.

From 2008 to 2011, Excalibur conducted mapping and sampling programs in the area, the results of which led to a 2011 trenching program. The 2012 trenching and drilling program on the Cariboo property continued this exploration work, following up on the positive results of the previous campaigns.

In 2012, Excalibur completed two trenches, from which four chip samples were taken, and 29 BQTW-size diamond drill holes totalling 1,469 metres. Most of this work was designed to fully assess the potential of the gold-silver bearing quartz carbonate vein system described as the Cariboo Vein. A brief prospecting program was also conducted on the Cleopatra and Princess claims, comprising five rock samples from the Princess claim and a single sample from the Cleopatra claim. Samples taken from some of the old workings on the Princess claim returned elevated copper, silver and gold values, with over 1% copper, 40 grams per tonne silver and 0.4 grams per tonne gold from sample PR12-02. The sample taken from the Cleopatra claim (sample CLEO12-01) did not return any significant values.

The Cariboo Vein extends along strike for approximately 120 metres and down-dip for a distance of 80 metres below surface. This target vein was intercepted numerous times during the 2012 drill program, and many of the individual drill intercepts can be composited into wider intervals of 2 metres to 5.5 metres. Diamond drill core assay results indicate a strong correlation between elevated silver and elevated gold values. Mineralization also includes lead, zinc and, in places, copper. Hole 12Car-16 returned the highest gold value, 11.1 grams per tonne (plus 92 grams per tonne silver) over 1.28 metres. The most significant results were announced in a press release on November 29, 2012 and can be found on www.sedar.com or on Excalibur's web site at www.excaliburresources.ca. The vein remains open at depth and along its southern extent.

Excalibur is one of a number of companies in the region that have found gold, including these immediate neighbours of Excalibur:


--  Sultan Minerals Inc., which has a joint venture with Altair Gold Inc.
    and more than 1 million measured, indicated and inferred gold ounces. 
--  Valterra Resources Corp., which has a number of gold intercepts,
    including 4 grams per tonne Au + 9 grams per tonne Ag over 24.3 metres. 
--  Emgold Corp., which drilled 1,495 metres in 2012 with intercepts
    including 11.1 grams per tonne gold over 1.45 metres.

Along with much exploration activity this past year, the region also experienced merger and acquisition (M&A) activity. A few kilometres to the north of the Cariboo, Anglo Swiss Resources Inc. recently announced the sale of its Kenville Gold Property to a private company, and Sultan Minerals Inc. entered a joint venture on a property neighbouring Excalibur's Cariboo Property. We expect more activity in the region, which reflects not only the difficult financing conditions facing junior miners but also the ample opportunities there. Excalibur's management views this region very favourably: over 5 million ounces of gold and 30 million ounces of silver have been mined there, and the area is not only endowed with minerals and excellent infrastructure but also steeped in mining tradition with an experienced workforce.

Tim Gallagher states: "We are pleased with the results of our 2012 work program, which verified what we had expected. We have just received the final Assessment Report from our consulting geologist on this year's activities. There are now a number of options that we can pursue, including bulk sampling, additional exploration, property acquisition, or any combination thereof, but most likely we will await further development by our neighbours as Mexico is our primary focus."

Perry Grunenberg, P.Geo. is a "Qualified Person" for the purpose of National Instrument 43-101, and has reviewed and verified the technical contents of this news release.

Excalibur Resources Ltd. is a junior exploration mining company focused on the discovery, development and mining of economically viable precious metal mineral resources.

On behalf of the Board of Directors:

Tim Gallagher, Chairman & CEO

Neither the Canadian National Stock Exchange nor its Regulation Services Provider accepts responsibility for the adequacy or accuracy of this release.
